2023-04-22 21:08:23.494 Hail: WARN: This Hail JAR was compiled for Spark 3.3.0, running with Spark 3.3.2. Compatibility is not guaranteed.
2023-04-22 21:08:23.511 SparkContext: INFO: Running Spark version 3.3.2
2023-04-22 21:08:23.575 ResourceUtils: INFO: ==============================================================
2023-04-22 21:08:23.575 ResourceUtils: INFO: No custom resources configured for spark.driver.
2023-04-22 21:08:23.575 ResourceUtils: INFO: ==============================================================
2023-04-22 21:08:23.576 SparkContext: INFO: Submitted application: Hail
2023-04-22 21:08:23.593 SparkContext: INFO: Spark configuration:
spark.app.name=Hail
spark.app.startTime=1682197703510
spark.app.submitTime=1682197703118
spark.driver.extraClassPath=/home/unix/aburns/.local/lib/python3.9/site-packages/hail/backend/hail-all-spark.jar
spark.driver.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
spark.driver.maxResultSize=0
spark.driver.memory=50g
spark.executor.extraClassPath=./hail-all-spark.jar
spark.executor.extraJavaOptions=-XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.concurrent=ALL-UNNAMED --add-opens=java.base/java.util.concurrent.atomic=ALL-UNNAMED --add-opens=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/sun.nio.cs=ALL-UNNAMED --add-opens=java.base/sun.security.action=ALL-UNNAMED --add-opens=java.base/sun.util.calendar=ALL-UNNAMED --add-opens=java.security.jgss/sun.security.krb5=ALL-UNNAMED
spark.hadoop.io.compression.codecs=org.apache.hadoop.io.compress.DefaultCodec,is.hail.io.compress.BGzipCodec,is.hail.io.compress.BGzipCodecTbi,org.apache.hadoop.io.compress.GzipCodec
spark.hadoop.mapreduce.input.fileinputformat.split.minsize=0
spark.jars=file:///home/unix/aburns/.local/lib/python3.9/site-packages/hail/backend/hail-all-spark.jar
spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator
spark.kryoserializer.buffer.max=1g
spark.logConf=true
spark.master=local[*]
spark.repl.local.jars=file:///home/unix/aburns/.local/lib/python3.9/site-packages/hail/backend/hail-all-spark.jar
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.submit.deployMode=client
spark.submit.pyFiles=
spark.ui.showConsoleProgress=false
2023-04-22 21:08:23.631 ResourceProfile: INFO: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
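Note: the configuration dump above corresponds to a local-mode Hail session (spark.master=local[*], a 50g driver, Kryo serialization, hail-all-spark.jar on the driver and executor classpaths). The initializing call is not recorded in this log; a minimal sketch of a Python session that would produce a similar configuration (the exact arguments are an assumption) is:

    import hail as hl

    # Hypothetical initialization; the master URL, driver memory and Kryo buffer
    # mirror values shown under "Spark configuration:" above. Hail itself injects
    # hail-all-spark.jar, the Kryo registrator and the BGzip codecs.
    hl.init(
        master='local[*]',
        spark_conf={
            'spark.driver.memory': '50g',
            'spark.driver.maxResultSize': '0',
            'spark.kryoserializer.buffer.max': '1g',
        },
    )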
2023-04-22 21:08:23.652 ResourceProfile: INFO: Limiting resource is cpu
2023-04-22 21:08:23.653 ResourceProfileManager: INFO: Added ResourceProfile id: 0
2023-04-22 21:08:23.751 SecurityManager: INFO: Changing view acls to: aburns
2023-04-22 21:08:23.752 SecurityManager: INFO: Changing modify acls to: aburns
2023-04-22 21:08:23.752 SecurityManager: INFO: Changing view acls groups to:
2023-04-22 21:08:23.753 SecurityManager: INFO: Changing modify acls groups to:
2023-04-22 21:08:23.753 SecurityManager: INFO: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aburns); groups with view permissions: Set(); users with modify permissions: Set(aburns); groups with modify permissions: Set()
2023-04-22 21:08:24.290 Utils: INFO: Successfully started service 'sparkDriver' on port 33543.
2023-04-22 21:08:24.429 SparkEnv: INFO: Registering MapOutputTracker
2023-04-22 21:08:24.499 SparkEnv: INFO: Registering BlockManagerMaster
2023-04-22 21:08:24.531 BlockManagerMasterEndpoint: INFO: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2023-04-22 21:08:24.532 BlockManagerMasterEndpoint: INFO: BlockManagerMasterEndpoint up
2023-04-22 21:08:24.540 SparkEnv: INFO: Registering BlockManagerMasterHeartbeat
2023-04-22 21:08:24.581 DiskBlockManager: INFO: Created local directory at /tmp/blockmgr-c7fbbdda-2e71-412f-9dea-20fcd13c6de2
2023-04-22 21:08:24.637 MemoryStore: INFO: MemoryStore started with capacity 28.8 GiB
2023-04-22 21:08:24.695 SparkEnv: INFO: Registering OutputCommitCoordinator
2023-04-22 21:08:24.802 log: INFO: Logging initialized @5437ms to org.sparkproject.jetty.util.log.Slf4jLog
2023-04-22 21:08:25.132 Server: INFO: jetty-9.4.48.v20220622; built: 2022-06-21T20:42:25.880Z; git: 6b67c5719d1f4371b33655ff2d047d24e171e49a; jvm 1.8.0_345-b01
2023-04-22 21:08:25.297 Server: INFO: Started @5933ms
2023-04-22 21:08:25.363 AbstractConnector: INFO: Started ServerConnector@268edf65{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2023-04-22 21:08:25.363 Utils: INFO: Successfully started service 'SparkUI' on port 4040.
2023-04-22 21:08:25.440 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@6c41559b{/,null,AVAILABLE,@Spark}
2023-04-22 21:08:25.465 SparkContext: INFO: Added JAR file:///home/unix/aburns/.local/lib/python3.9/site-packages/hail/backend/hail-all-spark.jar at spark://uger-c010.broadinstitute.org:33543/jars/hail-all-spark.jar with timestamp 1682197703510
2023-04-22 21:08:25.705 Executor: INFO: Starting executor ID driver on host uger-c010.broadinstitute.org
2023-04-22 21:08:25.714 Executor: INFO: Starting executor with user classpath (userClassPathFirst = false): 'file:/home/unix/aburns/./hail-all-spark.jar,file:/home/unix/aburns/hail-all-spark.jar'
2023-04-22 21:08:25.794 Utils: INFO: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 46121.
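Note: the sample/variant counts reported below (4151 samples, 114591 variants) and the IR dumps later in this log are consistent with a PLINK import (GRCh38, a2_reference=True, numeric-to-"chr" contig recoding) followed by a row filter on aggregated allele counts and a count of the remaining variants. The originating user code is not captured in the log; a plausible reconstruction, using the file paths from the MatrixPLINKReader JSON further down, might look like:

    import hail as hl

    # Hypothetical reconstruction of the query behind the IR dumps below.
    prefix = '/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD'
    contig_recoding = {str(i): f'chr{i}' for i in range(1, 23)}
    contig_recoding.update({'23': 'chrX', '24': 'chrY', '25': 'chrX', '26': 'chrM'})

    mt = hl.import_plink(
        bed=prefix + '.bed',
        bim=prefix + '.bim',
        fam=prefix + '.fam',
        reference_genome='GRCh38',
        contig_recoding=contig_recoding,
        a2_reference=True,
    )

    # The IR filter keeps variants with 0 < AC < 2 * n_called, i.e. sites that
    # are not monomorphic for either allele among called genotypes.
    mt = mt.annotate_rows(
        AC=hl.agg.sum(mt.GT.n_non_ref_alleles()),
        n_called=hl.agg.count_where(hl.is_defined(mt.GT)),
    )
    mt = mt.filter_rows((mt.AC > 0) & (mt.AC < 2 * mt.n_called))

    n_variants = mt.count_rows()  # lowered to the TableCount node seen in the IR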
2023-04-22 21:08:25.794 NettyBlockTransferService: INFO: Server created on uger-c010.broadinstitute.org:46121
2023-04-22 21:08:25.796 BlockManager: INFO: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2023-04-22 21:08:25.806 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:08:25.811 BlockManagerMasterEndpoint: INFO: Registering block manager uger-c010.broadinstitute.org:46121 with 28.8 GiB RAM, BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:08:25.815 BlockManagerMaster: INFO: Registered BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:08:25.816 BlockManager: INFO: Initialized BlockManager: BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:08:26.333 ContextHandler: INFO: Stopped o.s.j.s.ServletContextHandler@6c41559b{/,null,STOPPED,@Spark}
2023-04-22 21:08:26.355 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@7b9cb1c1{/jobs,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.356 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@4bdd0bd9{/jobs/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.357 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@5b1b5028{/jobs/job,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.358 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@69dc3eb2{/jobs/job/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.359 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@2218796{/stages,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.360 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@2a416640{/stages/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.368 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@736a7be3{/stages/stage,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.369 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@64c2a0c4{/stages/stage/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.369 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@55da0c83{/stages/pool,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.370 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@ebd8474{/stages/pool/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.371 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@e7c1a1{/storage,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.372 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@5bd44d4e{/storage/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.373 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@36671f4{/storage/rdd,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.374 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@476fcde{/storage/rdd/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.375 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@1b1bcfa2{/environment,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.376 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@5effd49b{/environment/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.377 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@2735351e{/executors,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.422 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@83255c1{/executors/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.425 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@50e0886b{/executors/threadDump,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.426 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@2a7549c5{/executors/threadDump/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.440 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@3e6a21b6{/static,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.440 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@1573aca4{/,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.443 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@5cf71bea{/api,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.458 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@8d576f0{/jobs/job/kill,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.459 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@29fc177d{/stages/stage/kill,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.465 ContextHandler: INFO: Started o.s.j.s.ServletContextHandler@4942c373{/metrics/json,null,AVAILABLE,@Spark}
2023-04-22 21:08:26.501 Hail: INFO: SparkUI: http://uger-c010.broadinstitute.org:4040
2023-04-22 21:08:26.965 Hail: INFO: Running Hail version 0.2.113-cf32652c5077
2023-04-22 21:08:28.527 SparkContext: WARN: Using an existing SparkContext; some configuration may not take effect.
2023-04-22 21:08:32.815 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:08:35.427 Hail: INFO: Found 4151 samples in fam file.
2023-04-22 21:08:35.430 Hail: INFO: Found 114591 variants in bim file.
2023-04-22 21:08:35.644 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:08:35.645 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:08:35.647 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:08:35.648 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5
2023-04-22 21:08:35.649 : INFO: timing SparkBackend.parse_matrix_ir total 2.838s self 2.838s children 0.000ms %children 0.00%
2023-04-22 21:08:50.333 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:08:51.138 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:08:51.138 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:08:51.139 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:08:51.139 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5
2023-04-22 21:08:51.139 : INFO: timing SparkBackend.parse_value_ir total 806.638ms self 806.638ms children 0.000ms %children 0.00%
2023-04-22 21:08:51.142 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:08:51.144 : INFO: starting execution of query hail_query_1 of initial size 46
2023-04-22 21:08:51.450 : INFO: initial IR: IR size 46: (Let __rng_state (RNGStateLiteral) (TableCount (MatrixRowsTable (MatrixFilterRows (MatrixMapRows (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (SelectFields () (SelectFields (GT) (Ref g))) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))) (AggLet __cse_1 False (GetField __gt (Ref g)) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref va)) None (__AC (ApplyAggOp Sum () ((ApplyIR 2 toInt64 () Int64 (Ref __cse_1))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __cse_1)))))))))) (Let __cse_2 (GetField __AC (Ref va)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __cse_2) (ApplyIR 4 toInt64 () Int64 (I32 0))) (ApplyComparisonOp LT (Ref __cse_2) (ApplyBinaryPrimOp Multiply (ApplyIR 5 toInt64 () Int64 (I32 2)) (GetField __n_called (Ref va))))) (False))))))) 2023-04-22 21:08:51.849 : INFO: after optimize: relationalLowerer, initial IR: IR size 40: (TableCount (TableFilter (MatrixRowsTable (MatrixMapRows (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String]},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))) (AggLet __iruid_10 False (GetField __gt (Ref g)) (InsertFields (Ref va) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_10))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_10))))))))))) (Let __iruid_11 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_11) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_11) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:51.989 : INFO: after LowerMatrixToTable: IR size 93: (TableCount (TableFilter (TableMapRows (TableMapGlobals (TableMapRows (TableMapRows (TableRead 
Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (Ref row) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamZip -1 AssumeSameLength (g sa) (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (ToStream False (GetField __cols (Ref global))) (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))))))) (Let n_cols (ArrayLen (GetField __cols (Ref global))) (InsertFields (Let __iruid_12 (MakeStruct) (StreamAgg i (ToStream False (ToArray (StreamFilter i (ToStream False (ToArray (StreamRange -1 False (I32 0) (ArrayLen (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (I32 1)))) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i))))))) (AggLet sa False (ArrayRef -1 (GetField __cols (Ref global)) (Ref i)) (AggLet g False (ArrayRef -1 (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i)) (AggLet __iruid_10 False (GetField __gt (Ref g)) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_10))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_10))))))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)))))) (SelectFields () (Ref global))) (SelectFields (locus alleles __AC __n_called) (Ref row))) (Let __iruid_11 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_11) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_11) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:52.175 : INFO: Prune: InsertFields: eliminating field 'the entries! [877f12a8827e18f61222c6c8c5fb04a8]' 2023-04-22 21:08:52.308 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 60: (TableCount (TableFilter (TableMapRows (TableKeyBy () False (TableRead Table{global:Struct{},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_38 (ToArray (StreamMap __iruid_39 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_39)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_39))))))) (StreamAgg __iruid_40 (StreamFilter __iruid_41 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_38)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_41))))) (AggLet __iruid_42 False (GetField __gt (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_40))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_42))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_42)))))))))))) (Let __iruid_43 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_43) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_43) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:52.368 : INFO: after LiftRelationalValuesToRelationalLets: IR size 62: (RelationalLet __iruid_44 (TableCount (TableFilter (TableMapRows (TableKeyBy () False (TableRead Table{global:Struct{},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_38 (ToArray (StreamMap __iruid_39 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_39)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_39))))))) (StreamAgg __iruid_40 (StreamFilter __iruid_41 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_38)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_41))))) (AggLet __iruid_42 False (GetField __gt (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_40))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_42))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_42)))))))))))) (Let __iruid_43 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_43) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_43) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) (RelationalRef __iruid_44 Int64)) 2023-04-22 21:08:52.377 : INFO: initial IR: IR size 60: (TableCount (TableFilter (TableMapRows (TableKeyBy () False (TableRead Table{global:Struct{},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_38 (ToArray (StreamMap __iruid_39 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_39)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_39))))))) (StreamAgg __iruid_40 (StreamFilter __iruid_41 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_38)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_41))))) (AggLet __iruid_42 False (GetField __gt (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_40))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_42))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_42)))))))))))) (Let __iruid_43 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_43) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_43) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:52.432 : INFO: after LowerAndExecuteShuffles: IR size 60: (TableCount (TableFilter (TableMapRows (TableKeyBy () False (TableRead Table{global:Struct{},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_38 (ToArray (StreamMap __iruid_39 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_39)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_39))))))) (StreamAgg __iruid_40 (StreamFilter __iruid_41 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_38)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_41))))) (AggLet __iruid_42 False (GetField __gt (ArrayRef -1 (Ref __iruid_38) (Ref __iruid_40))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_42))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_42)))))))))))) (Let __iruid_43 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_43) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_43) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:52.607 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 60: (TableCount (TableFilter (TableMapRows (TableKeyBy () False (TableRead Table{global:Struct{},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_57 (ToArray (StreamMap __iruid_58 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_58)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_58))))))) (StreamAgg __iruid_59 (StreamFilter __iruid_60 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_57)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_57) (Ref __iruid_60))))) (AggLet __iruid_61 False (GetField __gt (ArrayRef -1 (Ref __iruid_57) (Ref __iruid_59))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_61))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_61)))))))))))) (Let __iruid_62 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_62) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_62) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False))))) 2023-04-22 21:08:52.635 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 2023-04-22 21:08:52.635 : INFO: lowering result: TableCount 2023-04-22 21:08:53.029 MemoryStore: INFO: Block broadcast_0 stored as values in memory (estimated size 47.3 MiB, free 28.8 GiB) 2023-04-22 21:08:54.651 MemoryStore: INFO: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.3 MiB, free 28.8 GiB) 2023-04-22 21:08:54.662 BlockManagerInfo: INFO: Added broadcast_0_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 2.3 MiB, free: 28.8 GiB) 2023-04-22 21:08:54.671 SparkContext: INFO: Created broadcast 0 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:54.720 : INFO: compiling and evaluating result: TableCount 2023-04-22 21:08:54.774 : INFO: initial IR: IR size 81: (ApplyIR -1 sum () Int64 (Let __iruid_63 (Literal Struct{} ) (Let __iruid_67 (CollectDistributedArray count_per_partition __iruid_64 __iruid_66 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (MakeStruct (__iruid_63 (Ref __iruid_63))) (Let __iruid_63 (GetField __iruid_63 (Ref __iruid_66)) (Cast Int64 (StreamLen (Let global (Ref __iruid_63) (StreamFilter row (Let global (Ref __iruid_63) (StreamMap __iruid_65 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_64)) (Let row (Ref __iruid_65) (Let __iruid_57 (ToArray (StreamMap __iruid_58 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_58)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_58))))))) (StreamAgg __iruid_59 (StreamFilter __iruid_60 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_57)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_57) (Ref __iruid_60))))) (AggLet __iruid_61 False (GetField __gt (ArrayRef -1 (Ref __iruid_57) (Ref __iruid_59))) (InsertFields (SelectFields () (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_61))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_61)))))))))))))) (Let __iruid_62 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_62) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_62) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))))))) (NA String)) (Ref __iruid_67)))) 2023-04-22 21:08:55.013 : INFO: after optimize: relationalLowerer, initial IR: IR size 76: (StreamFold __iruid_107 __iruid_108 (ToStream False (CollectDistributedArray count_per_partition __iruid_109 __iruid_110 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (If (IsNA (Literal Struct{__iruid_63:Struct{}} )) (NA Struct{}) (Literal Struct{} )) (Cast Int64 (StreamLen (StreamFilter __iruid_112 (StreamMap __iruid_113 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_109)) (Let __iruid_114 (ToArray (StreamMap __iruid_115 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_113))) (InsertFields (SelectFields () (Ref __iruid_115)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_115))))))) (StreamAgg __iruid_116 (StreamFilter __iruid_117 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_114)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_114) (Ref __iruid_117))))) (AggLet __iruid_118 False (GetField __gt (ArrayRef -1 (Ref __iruid_114) (Ref __iruid_116))) (InsertFields (SelectFields () (Ref __iruid_113)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_118))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_118)))))))))))) (Let __iruid_119 (GetField __AC (Ref __iruid_112)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_119) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_119) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_112))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_107) (Ref __iruid_108))) 2023-04-22 21:08:55.019 : INFO: after LowerMatrixToTable: IR size 76: (StreamFold __iruid_107 __iruid_108 (ToStream False (CollectDistributedArray count_per_partition __iruid_109 __iruid_110 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (If (IsNA (Literal Struct{__iruid_63:Struct{}} )) (NA Struct{}) (Literal Struct{} )) (Cast Int64 (StreamLen (StreamFilter __iruid_112 (StreamMap __iruid_113 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_109)) (Let __iruid_114 (ToArray (StreamMap __iruid_115 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_113))) (InsertFields (SelectFields () (Ref __iruid_115)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_115))))))) (StreamAgg __iruid_116 (StreamFilter __iruid_117 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_114)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_114) (Ref __iruid_117))))) (AggLet __iruid_118 False (GetField __gt (ArrayRef -1 (Ref __iruid_114) (Ref __iruid_116))) (InsertFields (SelectFields () (Ref __iruid_113)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_118))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_118)))))))))))) (Let __iruid_119 (GetField __AC (Ref __iruid_112)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_119) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_119) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_112))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_107) (Ref __iruid_108))) 2023-04-22 21:08:55.172 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 72: (StreamFold __iruid_144 __iruid_145 (ToStream False (CollectDistributedArray count_per_partition __iruid_146 __iruid_147 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_148 (StreamMap __iruid_149 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_146)) (Let __iruid_150 (ToArray (StreamMap __iruid_151 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_149))) (InsertFields (SelectFields () (Ref __iruid_151)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_151))))))) (StreamAgg __iruid_152 (StreamFilter __iruid_153 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_150)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_153))))) (AggLet __iruid_154 False (GetField __gt (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_152))) (InsertFields (SelectFields () (Ref __iruid_149)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_154))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_154)))))))))))) (Let __iruid_155 (GetField __AC (Ref __iruid_148)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_155) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_155) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_148))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_144) (Ref __iruid_145))) 2023-04-22 21:08:55.216 : INFO: after LiftRelationalValuesToRelationalLets: IR size 72: (StreamFold __iruid_144 __iruid_145 (ToStream False (CollectDistributedArray count_per_partition __iruid_146 __iruid_147 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_148 (StreamMap __iruid_149 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_146)) (Let __iruid_150 (ToArray (StreamMap __iruid_151 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_149))) (InsertFields (SelectFields () (Ref __iruid_151)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_151))))))) (StreamAgg __iruid_152 (StreamFilter __iruid_153 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_150)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_153))))) (AggLet __iruid_154 False (GetField __gt (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_152))) (InsertFields (SelectFields () (Ref __iruid_149)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_154))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_154)))))))))))) (Let __iruid_155 (GetField __AC (Ref __iruid_148)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_155) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_155) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_148))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_144) (Ref __iruid_145))) 2023-04-22 21:08:55.221 : INFO: after EvalRelationalLets: IR size 72: (StreamFold __iruid_144 __iruid_145 (ToStream False (CollectDistributedArray count_per_partition __iruid_146 __iruid_147 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_148 (StreamMap __iruid_149 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_146)) (Let __iruid_150 (ToArray (StreamMap __iruid_151 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_149))) (InsertFields (SelectFields () (Ref __iruid_151)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_151))))))) (StreamAgg __iruid_152 (StreamFilter __iruid_153 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_150)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_153))))) (AggLet __iruid_154 False (GetField __gt (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_152))) (InsertFields (SelectFields () (Ref __iruid_149)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_154))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_154)))))))))))) (Let __iruid_155 (GetField __AC (Ref __iruid_148)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_155) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_155) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_148))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_144) (Ref __iruid_145))) 2023-04-22 21:08:55.227 : INFO: after LowerAndExecuteShuffles: IR size 72: (StreamFold __iruid_144 __iruid_145 (ToStream False (CollectDistributedArray count_per_partition __iruid_146 __iruid_147 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_148 (StreamMap __iruid_149 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_146)) (Let __iruid_150 (ToArray (StreamMap __iruid_151 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_149))) (InsertFields (SelectFields () (Ref __iruid_151)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_151))))))) (StreamAgg __iruid_152 (StreamFilter __iruid_153 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_150)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_153))))) (AggLet __iruid_154 False (GetField __gt (ArrayRef -1 (Ref __iruid_150) (Ref __iruid_152))) (InsertFields (SelectFields () (Ref __iruid_149)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_154))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_154)))))))))))) (Let __iruid_155 (GetField __AC (Ref __iruid_148)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_155) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_155) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_148))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_144) (Ref __iruid_145))) 2023-04-22 21:08:55.342 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 72: (StreamFold __iruid_180 __iruid_181 (ToStream False (CollectDistributedArray count_per_partition __iruid_182 __iruid_183 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_184 (StreamMap __iruid_185 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_182)) (Let __iruid_186 (ToArray (StreamMap __iruid_187 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_185))) (InsertFields (SelectFields () (Ref __iruid_187)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_187))))))) (StreamAgg __iruid_188 (StreamFilter __iruid_189 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_186)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_186) (Ref __iruid_189))))) (AggLet __iruid_190 False (GetField __gt (ArrayRef -1 (Ref __iruid_186) (Ref __iruid_188))) (InsertFields (SelectFields () (Ref __iruid_185)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_190))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_190)))))))))))) (Let __iruid_191 (GetField __AC (Ref __iruid_184)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_191) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_191) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_184))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_180) (Ref __iruid_181))) 2023-04-22 21:08:55.395 : INFO: after LowerOrInterpretNonCompilable: IR size 72: (StreamFold __iruid_180 __iruid_181 (ToStream False (CollectDistributedArray count_per_partition __iruid_182 __iruid_183 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_184 (StreamMap __iruid_185 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_182)) (Let __iruid_186 (ToArray (StreamMap __iruid_187 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_185))) (InsertFields (SelectFields () (Ref __iruid_187)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_187))))))) (StreamAgg __iruid_188 (StreamFilter __iruid_189 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_186)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_186) (Ref __iruid_189))))) (AggLet __iruid_190 False (GetField __gt (ArrayRef -1 (Ref __iruid_186) (Ref __iruid_188))) (InsertFields (SelectFields () (Ref __iruid_185)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_190))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_190)))))))))))) (Let __iruid_191 (GetField __AC (Ref __iruid_184)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_191) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_191) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_184))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_180) (Ref __iruid_181))) 2023-04-22 21:08:55.469 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 72: (StreamFold __iruid_216 __iruid_217 (ToStream False (CollectDistributedArray count_per_partition __iruid_218 __iruid_219 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_220 (StreamMap __iruid_221 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_218)) (Let __iruid_222 (ToArray (StreamMap __iruid_223 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_221))) (InsertFields (SelectFields () (Ref __iruid_223)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_223))))))) (StreamAgg __iruid_224 (StreamFilter __iruid_225 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_222)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_225))))) (AggLet __iruid_226 False (GetField __gt (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_224))) (InsertFields (SelectFields () (Ref __iruid_221)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_226))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_226)))))))))))) (Let __iruid_227 (GetField __AC (Ref __iruid_220)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_227) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_227) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_220))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_216) (Ref __iruid_217))) 2023-04-22 21:08:55.594 : INFO: initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_216 __iruid_217 (ToStream False (CollectDistributedArray count_per_partition __iruid_218 __iruid_219 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_220 (StreamMap __iruid_221 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_218)) (Let __iruid_222 (ToArray (StreamMap __iruid_223 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_221))) (InsertFields (SelectFields () (Ref __iruid_223)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_223))))))) (StreamAgg __iruid_224 (StreamFilter __iruid_225 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_222)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_225))))) (AggLet __iruid_226 False (GetField __gt (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_224))) (InsertFields (SelectFields () (Ref __iruid_221)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_226))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_226)))))))))))) (Let __iruid_227 (GetField __AC (Ref __iruid_220)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_227) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_227) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_220))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_216) (Ref __iruid_217)))) 2023-04-22 21:08:55.687 : INFO: after optimize: compileLowerer, initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_252 __iruid_253 (ToStream False (CollectDistributedArray count_per_partition __iruid_254 __iruid_255 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_256 (StreamMap __iruid_257 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_254)) (Let __iruid_258 (ToArray (StreamMap __iruid_259 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_257))) (InsertFields (SelectFields () (Ref __iruid_259)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_259))))))) (StreamAgg __iruid_260 (StreamFilter __iruid_261 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_258)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_258) (Ref __iruid_261))))) (AggLet __iruid_262 False (GetField __gt (ArrayRef -1 (Ref __iruid_258) (Ref __iruid_260))) (InsertFields (SelectFields () (Ref __iruid_257)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_262))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_262)))))))))))) (Let __iruid_263 (GetField __AC (Ref __iruid_256)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_263) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_263) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_256))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_252) (Ref __iruid_253)))) 2023-04-22 21:08:55.693 : INFO: after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_252 __iruid_253 (ToStream False (CollectDistributedArray count_per_partition __iruid_254 __iruid_255 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_256 (StreamMap __iruid_257 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_254)) (Let __iruid_258 (ToArray (StreamMap __iruid_259 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_257))) (InsertFields (SelectFields () (Ref __iruid_259)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_259))))))) (StreamAgg __iruid_260 (StreamFilter __iruid_261 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_258)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_258) (Ref __iruid_261))))) (AggLet __iruid_262 False (GetField __gt (ArrayRef -1 (Ref __iruid_258) (Ref __iruid_260))) (InsertFields (SelectFields () (Ref __iruid_257)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_262))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_262)))))))))))) (Let __iruid_263 (GetField __AC (Ref __iruid_256)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_263) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_263) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_256))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_252) (Ref __iruid_253)))) 2023-04-22 21:08:55.774 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_288 __iruid_289 (ToStream False (CollectDistributedArray count_per_partition __iruid_290 __iruid_291 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_292 (StreamMap __iruid_293 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_290)) (Let __iruid_294 (ToArray (StreamMap __iruid_295 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_293))) (InsertFields (SelectFields () (Ref __iruid_295)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_295))))))) (StreamAgg __iruid_296 (StreamFilter __iruid_297 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_294)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_294) (Ref __iruid_297))))) (AggLet __iruid_298 False (GetField __gt (ArrayRef -1 (Ref __iruid_294) (Ref __iruid_296))) (InsertFields (SelectFields () (Ref __iruid_293)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_298))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_298)))))))))))) (Let __iruid_299 (GetField __AC (Ref __iruid_292)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_299) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_299) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_292))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_288) (Ref __iruid_289)))) 2023-04-22 21:08:55.860 : INFO: after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_288 __iruid_289 (ToStream False (CollectDistributedArray count_per_partition __iruid_290 __iruid_291 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_292 (StreamMap __iruid_293 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_290)) (Let __iruid_294 (ToArray (StreamMap __iruid_295 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_293))) (InsertFields (SelectFields () (Ref __iruid_295)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_295))))))) (Let __iruid_300 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_296 (StreamFilter __iruid_297 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_294)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_294) (Ref __iruid_297))))) (Let __iruid_298 (GetField __gt (ArrayRef -1 (Ref __iruid_294) (Ref __iruid_296))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_298)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_298)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_293)) None (__AC (GetTupleElement 0 (Ref __iruid_300))) (__n_called (GetTupleElement 1 (Ref __iruid_300))))))) (Let __iruid_299 (GetField __AC (Ref __iruid_292)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_299) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_299) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_292))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_288) (Ref __iruid_289)))) 2023-04-22 21:08:55.993 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_327 __iruid_328 (ToStream False (CollectDistributedArray count_per_partition __iruid_329 __iruid_330 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_331 (StreamMap __iruid_332 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_329)) (Let __iruid_333 (ToArray (StreamMap __iruid_334 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_332))) (InsertFields (SelectFields () (Ref __iruid_334)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_334))))))) (Let __iruid_335 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_336 (StreamFilter __iruid_337 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_333)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_333) (Ref __iruid_337))))) (Let __iruid_338 (GetField __gt (ArrayRef -1 (Ref __iruid_333) (Ref __iruid_336))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_338)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_338)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_332)) None (__AC (GetTupleElement 0 (Ref __iruid_335))) (__n_called (GetTupleElement 1 (Ref __iruid_335))))))) (Let __iruid_339 (GetField __AC (Ref __iruid_331)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_339) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_339) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_331))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_327) (Ref __iruid_328)))) 2023-04-22 21:08:56.653 : INFO: instruction count: 3: __C14HailClassLoaderContainer. 2023-04-22 21:08:56.654 : INFO: instruction count: 3: __C14HailClassLoaderContainer. 2023-04-22 21:08:56.656 : INFO: instruction count: 3: __C16FSContainer. 2023-04-22 21:08:56.657 : INFO: instruction count: 3: __C16FSContainer. 2023-04-22 21:08:56.790 : INFO: instruction count: 3: __C18collect_distributed_array_count_per_partition. 
2023-04-22 21:08:56.792 : INFO: instruction count: 114: __C18collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:56.792 : INFO: instruction count: 17: __C18collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:56.792 : INFO: instruction count: 27: __C18collect_distributed_array_count_per_partition.__m20DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:56.793 : INFO: instruction count: 44: __C18collect_distributed_array_count_per_partition.__m21INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:56.793 : INFO: instruction count: 31: __C18collect_distributed_array_count_per_partition.__m22INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:56.793 : INFO: instruction count: 10: __C18collect_distributed_array_count_per_partition.__m23INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:56.794 : INFO: instruction count: 27: __C18collect_distributed_array_count_per_partition.__m25DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:56.794 : INFO: instruction count: 8: __C18collect_distributed_array_count_per_partition.__m26INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:56.797 : INFO: instruction count: 298: __C18collect_distributed_array_count_per_partition.__m28split_StreamLen 2023-04-22 21:08:56.799 : INFO: instruction count: 205: __C18collect_distributed_array_count_per_partition.__m37split_ToArray 2023-04-22 21:08:56.799 : INFO: instruction count: 8: __C18collect_distributed_array_count_per_partition.__m45nNonRefAlleles 2023-04-22 21:08:56.800 : INFO: instruction count: 9: __C18collect_distributed_array_count_per_partition.__m57begin_group_0 2023-04-22 21:08:56.800 : INFO: instruction count: 17: __C18collect_distributed_array_count_per_partition.__m58begin_group_0 2023-04-22 21:08:56.801 : INFO: instruction count: 155: __C18collect_distributed_array_count_per_partition.__m59split_StreamFor 2023-04-22 21:08:56.801 : INFO: instruction count: 35: __C18collect_distributed_array_count_per_partition.__m67arrayref_bounds_check 2023-04-22 21:08:56.802 : INFO: instruction count: 73: __C18collect_distributed_array_count_per_partition.__m71begin_group_0 2023-04-22 21:08:56.802 : INFO: instruction count: 5: __C18collect_distributed_array_count_per_partition.__m74toInt64 2023-04-22 21:08:56.803 : INFO: instruction count: 11: __C18collect_distributed_array_count_per_partition.__m85ord_gt 2023-04-22 21:08:56.803 : INFO: instruction count: 16: __C18collect_distributed_array_count_per_partition.__m86ord_gtNonnull 2023-04-22 21:08:56.803 : INFO: instruction count: 11: __C18collect_distributed_array_count_per_partition.__m87ord_lt 2023-04-22 21:08:56.804 : INFO: instruction count: 16: __C18collect_distributed_array_count_per_partition.__m88ord_ltNonnull 2023-04-22 21:08:56.804 : INFO: instruction count: 13: __C18collect_distributed_array_count_per_partition.__m92ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64END 2023-04-22 21:08:56.804 : INFO: instruction count: 4: __C18collect_distributed_array_count_per_partition.__m93ENCODE_SInt64$_TO_r_int64 2023-04-22 21:08:56.804 : INFO: instruction count: 9: __C18collect_distributed_array_count_per_partition.setPartitionIndex 2023-04-22 21:08:56.804 : INFO: instruction count: 4: __C18collect_distributed_array_count_per_partition.addPartitionRegion 2023-04-22 21:08:56.805 : INFO: instruction count: 4: 
__C18collect_distributed_array_count_per_partition.setPool 2023-04-22 21:08:56.805 : INFO: instruction count: 3: __C18collect_distributed_array_count_per_partition.addHailClassLoader 2023-04-22 21:08:56.805 : INFO: instruction count: 3: __C18collect_distributed_array_count_per_partition.addFS 2023-04-22 21:08:56.805 : INFO: instruction count: 4: __C18collect_distributed_array_count_per_partition.addTaskContext 2023-04-22 21:08:56.805 : INFO: instruction count: 3: __C18collect_distributed_array_count_per_partition.setObjects 2023-04-22 21:08:56.806 : INFO: instruction count: 3: __C89staticWrapperClass_1. 2023-04-22 21:08:56.886 : INFO: encoder cache miss (0 hits, 1 misses, 0.000) 2023-04-22 21:08:56.893 : INFO: instruction count: 3: __C130HailClassLoaderContainer. 2023-04-22 21:08:56.894 : INFO: instruction count: 3: __C130HailClassLoaderContainer. 2023-04-22 21:08:56.920 : INFO: instruction count: 3: __C132FSContainer. 2023-04-22 21:08:56.920 : INFO: instruction count: 3: __C132FSContainer. 2023-04-22 21:08:56.924 : INFO: instruction count: 3: __C134etypeEncode. 2023-04-22 21:08:56.924 : INFO: instruction count: 7: __C134etypeEncode.apply 2023-04-22 21:08:56.924 : INFO: instruction count: 33: __C134etypeEncode.__m136ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:08:56.925 : INFO: instruction count: 1: __C134etypeEncode.__m137ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:08:56.925 : INFO: instruction count: 35: __C134etypeEncode.__m138ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:56.925 : INFO: instruction count: 49: __C134etypeEncode.__m139ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:56.926 : INFO: instruction count: 16: __C134etypeEncode.__m140ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:08:56.926 : INFO: instruction count: 4: __C134etypeEncode.__m141ENCODE_SInt32$_TO_r_int32 2023-04-22 21:08:56.968 MemoryStore: INFO: Block broadcast_1 stored as values in memory (estimated size 688.0 B, free 28.8 GiB) 2023-04-22 21:08:57.034 MemoryStore: INFO: Block broadcast_1_piece0 stored as bytes in memory (estimated size 217.0 B, free 28.8 GiB) 2023-04-22 21:08:57.037 BlockManagerInfo: INFO: Added broadcast_1_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 28.8 GiB) 2023-04-22 21:08:57.041 SparkContext: INFO: Created broadcast 1 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:57.042 : INFO: instruction count: 3: __C1HailClassLoaderContainer. 2023-04-22 21:08:57.042 : INFO: instruction count: 3: __C1HailClassLoaderContainer. 2023-04-22 21:08:57.043 : INFO: instruction count: 3: __C3FSContainer. 2023-04-22 21:08:57.043 : INFO: instruction count: 3: __C3FSContainer. 2023-04-22 21:08:57.070 : INFO: instruction count: 3: __C5Compiled. 
2023-04-22 21:08:57.071 : INFO: instruction count: 25: __C5Compiled.apply 2023-04-22 21:08:57.073 : INFO: instruction count: 343: __C5Compiled.__m7split_StreamFold 2023-04-22 21:08:57.073 : INFO: instruction count: 4: __C5Compiled.setBackend 2023-04-22 21:08:57.073 : INFO: instruction count: 9: __C5Compiled.__m102ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:08:57.074 : INFO: instruction count: 49: __C5Compiled.__m103ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.074 : INFO: instruction count: 16: __C5Compiled.__m104ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:08:57.074 : INFO: instruction count: 4: __C5Compiled.__m105ENCODE_SInt32$_TO_r_int32 2023-04-22 21:08:57.074 : INFO: instruction count: 9: __C5Compiled.__m106ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:08:57.074 : INFO: instruction count: 1: __C5Compiled.__m107ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:08:57.075 : INFO: instruction count: 27: __C5Compiled.__m110DECODE_r_struct_of_r_int64END_TO_SBaseStructPointer 2023-04-22 21:08:57.075 : INFO: instruction count: 10: __C5Compiled.__m111INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:08:57.075 : INFO: instruction count: 9: __C5Compiled.setPartitionIndex 2023-04-22 21:08:57.075 : INFO: instruction count: 4: __C5Compiled.addPartitionRegion 2023-04-22 21:08:57.075 : INFO: instruction count: 4: __C5Compiled.setPool 2023-04-22 21:08:57.076 : INFO: instruction count: 3: __C5Compiled.addHailClassLoader 2023-04-22 21:08:57.076 : INFO: instruction count: 3: __C5Compiled.addFS 2023-04-22 21:08:57.076 : INFO: instruction count: 4: __C5Compiled.addTaskContext 2023-04-22 21:08:57.076 : INFO: instruction count: 3: __C5Compiled.setObjects 2023-04-22 21:08:57.092 : INFO: instruction count: 64: __C5Compiled.addAndDecodeLiterals 2023-04-22 21:08:57.093 : INFO: instruction count: 36: __C5Compiled.__m124DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.093 : INFO: instruction count: 8: __C5Compiled.__m125INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:57.094 : INFO: instruction count: 58: __C5Compiled.__m126INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.094 : INFO: instruction count: 44: __C5Compiled.__m127INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.094 : INFO: instruction count: 31: __C5Compiled.__m128INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:57.094 : INFO: instruction count: 10: __C5Compiled.__m129INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:57.095 : INFO: instruction count: 3: __C108staticWrapperClass_1. 2023-04-22 21:08:57.097 : INFO: initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_216 __iruid_217 (ToStream False (CollectDistributedArray count_per_partition __iruid_218 __iruid_219 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_220 (StreamMap __iruid_221 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_218)) (Let __iruid_222 (ToArray (StreamMap __iruid_223 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_221))) (InsertFields (SelectFields () (Ref __iruid_223)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_223))))))) (StreamAgg __iruid_224 (StreamFilter __iruid_225 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_222)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_225))))) (AggLet __iruid_226 False (GetField __gt (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_224))) (InsertFields (SelectFields () (Ref __iruid_221)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_226))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_226)))))))))))) (Let __iruid_227 (GetField __AC (Ref __iruid_220)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_227) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_227) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_220))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_216) (Ref __iruid_217)))) 2023-04-22 21:08:57.163 : INFO: after optimize: compileLowerer, initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_365 __iruid_366 (ToStream False (CollectDistributedArray count_per_partition __iruid_367 __iruid_368 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_369 (StreamMap __iruid_370 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_367)) (Let __iruid_371 (ToArray (StreamMap __iruid_372 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_370))) (InsertFields (SelectFields () (Ref __iruid_372)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_372))))))) (StreamAgg __iruid_373 (StreamFilter __iruid_374 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_371)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_371) (Ref __iruid_374))))) (AggLet __iruid_375 False (GetField __gt (ArrayRef -1 (Ref __iruid_371) (Ref __iruid_373))) (InsertFields (SelectFields () (Ref __iruid_370)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_375))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_375)))))))))))) (Let __iruid_376 (GetField __AC (Ref __iruid_369)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_376) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_376) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_369))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_365) (Ref __iruid_366)))) 2023-04-22 21:08:57.166 : INFO: after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_365 __iruid_366 (ToStream False (CollectDistributedArray count_per_partition __iruid_367 __iruid_368 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_369 (StreamMap __iruid_370 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_367)) (Let __iruid_371 (ToArray (StreamMap __iruid_372 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_370))) (InsertFields (SelectFields () (Ref __iruid_372)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_372))))))) (StreamAgg __iruid_373 (StreamFilter __iruid_374 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_371)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_371) (Ref __iruid_374))))) (AggLet __iruid_375 False (GetField __gt (ArrayRef -1 (Ref __iruid_371) (Ref __iruid_373))) (InsertFields (SelectFields () (Ref __iruid_370)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_375))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_375)))))))))))) (Let __iruid_376 (GetField __AC (Ref __iruid_369)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_376) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_376) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_369))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_365) (Ref __iruid_366)))) 2023-04-22 21:08:57.226 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_401 __iruid_402 (ToStream False (CollectDistributedArray count_per_partition __iruid_403 __iruid_404 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_405 (StreamMap __iruid_406 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_403)) (Let __iruid_407 (ToArray (StreamMap __iruid_408 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_406))) (InsertFields (SelectFields () (Ref __iruid_408)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_408))))))) (StreamAgg __iruid_409 (StreamFilter __iruid_410 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_407)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_407) (Ref __iruid_410))))) (AggLet __iruid_411 False (GetField __gt (ArrayRef -1 (Ref __iruid_407) (Ref __iruid_409))) (InsertFields (SelectFields () (Ref __iruid_406)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_411))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_411)))))))))))) (Let __iruid_412 (GetField __AC (Ref __iruid_405)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_412) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_412) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_405))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_401) (Ref __iruid_402)))) 2023-04-22 21:08:57.241 : INFO: after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_401 __iruid_402 (ToStream False (CollectDistributedArray count_per_partition __iruid_403 __iruid_404 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_405 (StreamMap __iruid_406 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_403)) (Let __iruid_407 (ToArray (StreamMap __iruid_408 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_406))) (InsertFields (SelectFields () (Ref __iruid_408)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_408))))))) (Let __iruid_413 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_409 (StreamFilter __iruid_410 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_407)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_407) (Ref __iruid_410))))) (Let __iruid_411 (GetField __gt (ArrayRef -1 (Ref __iruid_407) (Ref __iruid_409))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_411)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_411)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_406)) None (__AC (GetTupleElement 0 (Ref __iruid_413))) (__n_called (GetTupleElement 1 (Ref __iruid_413))))))) (Let __iruid_412 (GetField __AC (Ref __iruid_405)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_412) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_412) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_405))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_401) (Ref __iruid_402)))) 2023-04-22 21:08:57.352 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_440 __iruid_441 (ToStream False (CollectDistributedArray count_per_partition __iruid_442 __iruid_443 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_444 (StreamMap __iruid_445 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_442)) (Let __iruid_446 (ToArray (StreamMap __iruid_447 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_445))) (InsertFields (SelectFields () (Ref __iruid_447)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_447))))))) (Let __iruid_448 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_449 (StreamFilter __iruid_450 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_446)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_446) (Ref __iruid_450))))) (Let __iruid_451 (GetField __gt (ArrayRef -1 (Ref __iruid_446) (Ref __iruid_449))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_451)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_451)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_445)) None (__AC (GetTupleElement 0 (Ref __iruid_448))) (__n_called (GetTupleElement 1 (Ref __iruid_448))))))) (Let __iruid_452 (GetField __AC (Ref __iruid_444)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_452) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_452) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_444))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_440) (Ref __iruid_441)))) 2023-04-22 21:08:57.476 : INFO: instruction count: 3: __C155HailClassLoaderContainer. 2023-04-22 21:08:57.477 : INFO: instruction count: 3: __C155HailClassLoaderContainer. 2023-04-22 21:08:57.477 : INFO: instruction count: 3: __C157FSContainer. 2023-04-22 21:08:57.477 : INFO: instruction count: 3: __C157FSContainer. 2023-04-22 21:08:57.504 : INFO: instruction count: 3: __C159collect_distributed_array_count_per_partition. 
2023-04-22 21:08:57.505 : INFO: instruction count: 114: __C159collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:57.505 : INFO: instruction count: 17: __C159collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:57.506 : INFO: instruction count: 27: __C159collect_distributed_array_count_per_partition.__m161DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.506 : INFO: instruction count: 44: __C159collect_distributed_array_count_per_partition.__m162INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.506 : INFO: instruction count: 31: __C159collect_distributed_array_count_per_partition.__m163INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:57.506 : INFO: instruction count: 10: __C159collect_distributed_array_count_per_partition.__m164INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:57.506 : INFO: instruction count: 27: __C159collect_distributed_array_count_per_partition.__m166DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.507 : INFO: instruction count: 8: __C159collect_distributed_array_count_per_partition.__m167INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:57.508 : INFO: instruction count: 298: __C159collect_distributed_array_count_per_partition.__m169split_StreamLen 2023-04-22 21:08:57.521 : INFO: instruction count: 205: __C159collect_distributed_array_count_per_partition.__m178split_ToArray 2023-04-22 21:08:57.521 : INFO: instruction count: 8: __C159collect_distributed_array_count_per_partition.__m186nNonRefAlleles 2023-04-22 21:08:57.521 : INFO: instruction count: 9: __C159collect_distributed_array_count_per_partition.__m198begin_group_0 2023-04-22 21:08:57.522 : INFO: instruction count: 17: __C159collect_distributed_array_count_per_partition.__m199begin_group_0 2023-04-22 21:08:57.522 : INFO: instruction count: 155: __C159collect_distributed_array_count_per_partition.__m200split_StreamFor 2023-04-22 21:08:57.523 : INFO: instruction count: 35: __C159collect_distributed_array_count_per_partition.__m208arrayref_bounds_check 2023-04-22 21:08:57.523 : INFO: instruction count: 73: __C159collect_distributed_array_count_per_partition.__m212begin_group_0 2023-04-22 21:08:57.523 : INFO: instruction count: 5: __C159collect_distributed_array_count_per_partition.__m215toInt64 2023-04-22 21:08:57.523 : INFO: instruction count: 11: __C159collect_distributed_array_count_per_partition.__m226ord_gt 2023-04-22 21:08:57.524 : INFO: instruction count: 16: __C159collect_distributed_array_count_per_partition.__m227ord_gtNonnull 2023-04-22 21:08:57.524 : INFO: instruction count: 11: __C159collect_distributed_array_count_per_partition.__m228ord_lt 2023-04-22 21:08:57.524 : INFO: instruction count: 16: __C159collect_distributed_array_count_per_partition.__m229ord_ltNonnull 2023-04-22 21:08:57.524 : INFO: instruction count: 13: __C159collect_distributed_array_count_per_partition.__m233ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64END 2023-04-22 21:08:57.524 : INFO: instruction count: 4: __C159collect_distributed_array_count_per_partition.__m234ENCODE_SInt64$_TO_r_int64 2023-04-22 21:08:57.524 : INFO: instruction count: 9: __C159collect_distributed_array_count_per_partition.setPartitionIndex 2023-04-22 21:08:57.525 : INFO: instruction count: 4: __C159collect_distributed_array_count_per_partition.addPartitionRegion 2023-04-22 21:08:57.525 : INFO: instruction count: 
4: __C159collect_distributed_array_count_per_partition.setPool 2023-04-22 21:08:57.525 : INFO: instruction count: 3: __C159collect_distributed_array_count_per_partition.addHailClassLoader 2023-04-22 21:08:57.525 : INFO: instruction count: 3: __C159collect_distributed_array_count_per_partition.addFS 2023-04-22 21:08:57.525 : INFO: instruction count: 4: __C159collect_distributed_array_count_per_partition.addTaskContext 2023-04-22 21:08:57.525 : INFO: instruction count: 3: __C159collect_distributed_array_count_per_partition.setObjects 2023-04-22 21:08:57.526 : INFO: instruction count: 3: __C230staticWrapperClass_1. 2023-04-22 21:08:57.566 : INFO: encoder cache hit 2023-04-22 21:08:57.568 MemoryStore: INFO: Block broadcast_2 stored as values in memory (estimated size 688.0 B, free 28.8 GiB) 2023-04-22 21:08:57.570 MemoryStore: INFO: Block broadcast_2_piece0 stored as bytes in memory (estimated size 217.0 B, free 28.8 GiB) 2023-04-22 21:08:57.575 BlockManagerInfo: INFO: Added broadcast_2_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 28.8 GiB) 2023-04-22 21:08:57.576 SparkContext: INFO: Created broadcast 2 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:57.577 : INFO: instruction count: 3: __C142HailClassLoaderContainer. 2023-04-22 21:08:57.577 : INFO: instruction count: 3: __C142HailClassLoaderContainer. 2023-04-22 21:08:57.577 : INFO: instruction count: 3: __C144FSContainer. 2023-04-22 21:08:57.577 : INFO: instruction count: 3: __C144FSContainer. 2023-04-22 21:08:57.584 : INFO: instruction count: 3: __C146Compiled. 2023-04-22 21:08:57.584 : INFO: instruction count: 25: __C146Compiled.apply 2023-04-22 21:08:57.586 : INFO: instruction count: 343: __C146Compiled.__m148split_StreamFold 2023-04-22 21:08:57.609 : INFO: instruction count: 4: __C146Compiled.setBackend 2023-04-22 21:08:57.609 : INFO: instruction count: 9: __C146Compiled.__m243ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:08:57.610 : INFO: instruction count: 49: __C146Compiled.__m244ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.610 : INFO: instruction count: 16: __C146Compiled.__m245ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:08:57.610 : INFO: instruction count: 4: __C146Compiled.__m246ENCODE_SInt32$_TO_r_int32 2023-04-22 21:08:57.610 : INFO: instruction count: 9: __C146Compiled.__m247ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:08:57.610 : INFO: instruction count: 1: __C146Compiled.__m248ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:08:57.610 : INFO: instruction count: 27: __C146Compiled.__m251DECODE_r_struct_of_r_int64END_TO_SBaseStructPointer 2023-04-22 21:08:57.611 : INFO: instruction count: 10: __C146Compiled.__m252INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:08:57.611 : INFO: instruction count: 9: __C146Compiled.setPartitionIndex 2023-04-22 21:08:57.611 : INFO: instruction count: 4: __C146Compiled.addPartitionRegion 2023-04-22 21:08:57.611 : INFO: instruction count: 4: __C146Compiled.setPool 2023-04-22 21:08:57.611 : INFO: instruction count: 3: __C146Compiled.addHailClassLoader 2023-04-22 21:08:57.611 : INFO: instruction count: 3: __C146Compiled.addFS 2023-04-22 21:08:57.611 : INFO: instruction count: 4: __C146Compiled.addTaskContext 2023-04-22 21:08:57.611 : INFO: instruction count: 3: __C146Compiled.setObjects 2023-04-22 21:08:57.612 : INFO: instruction count: 64: __C146Compiled.addAndDecodeLiterals 2023-04-22 
21:08:57.612 : INFO: instruction count: 36: __C146Compiled.__m265DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.612 : INFO: instruction count: 8: __C146Compiled.__m266INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:57.613 : INFO: instruction count: 58: __C146Compiled.__m267INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.618 : INFO: instruction count: 44: __C146Compiled.__m268INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.618 : INFO: instruction count: 31: __C146Compiled.__m269INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:57.618 : INFO: instruction count: 10: __C146Compiled.__m270INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:57.619 : INFO: instruction count: 3: __C249staticWrapperClass_1. 2023-04-22 21:08:57.621 : INFO: initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_216 __iruid_217 (ToStream False (CollectDistributedArray count_per_partition __iruid_218 __iruid_219 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_220 (StreamMap __iruid_221 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_218)) (Let __iruid_222 (ToArray (StreamMap __iruid_223 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_221))) (InsertFields (SelectFields () (Ref __iruid_223)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_223))))))) (StreamAgg __iruid_224 (StreamFilter __iruid_225 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_222)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_225))))) (AggLet __iruid_226 False (GetField __gt (ArrayRef -1 (Ref __iruid_222) (Ref __iruid_224))) (InsertFields (SelectFields () (Ref __iruid_221)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_226))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_226)))))))))))) (Let __iruid_227 (GetField __AC (Ref __iruid_220)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_227) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_227) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_220))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_216) (Ref __iruid_217)))) 2023-04-22 21:08:57.679 : INFO: after optimize: compileLowerer, initial IR: IR size 73: (MakeTuple (0) (StreamFold __iruid_478 __iruid_479 (ToStream False (CollectDistributedArray count_per_partition __iruid_480 __iruid_481 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_482 (StreamMap __iruid_483 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_480)) (Let __iruid_484 (ToArray (StreamMap __iruid_485 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_483))) (InsertFields (SelectFields () (Ref __iruid_485)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_485))))))) (StreamAgg __iruid_486 (StreamFilter __iruid_487 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_484)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_484) (Ref __iruid_487))))) (AggLet __iruid_488 False (GetField __gt (ArrayRef -1 (Ref __iruid_484) (Ref __iruid_486))) (InsertFields (SelectFields () (Ref __iruid_483)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_488))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_488)))))))))))) (Let __iruid_489 (GetField __AC (Ref __iruid_482)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_489) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_489) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_482))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_478) (Ref __iruid_479)))) 2023-04-22 21:08:57.682 : INFO: after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_478 __iruid_479 (ToStream False (CollectDistributedArray count_per_partition __iruid_480 __iruid_481 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_482 (StreamMap __iruid_483 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_480)) (Let __iruid_484 (ToArray (StreamMap __iruid_485 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_483))) (InsertFields (SelectFields () (Ref __iruid_485)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_485))))))) (StreamAgg __iruid_486 (StreamFilter __iruid_487 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_484)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_484) (Ref __iruid_487))))) (AggLet __iruid_488 False (GetField __gt (ArrayRef -1 (Ref __iruid_484) (Ref __iruid_486))) (InsertFields (SelectFields () (Ref __iruid_483)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_488))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_488)))))))))))) (Let __iruid_489 (GetField __AC (Ref __iruid_482)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_489) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_489) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_482))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_478) (Ref __iruid_479)))) 2023-04-22 21:08:57.729 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 73: (MakeTuple (0) (StreamFold __iruid_514 __iruid_515 (ToStream False (CollectDistributedArray count_per_partition __iruid_516 __iruid_517 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_518 (StreamMap __iruid_519 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_516)) (Let __iruid_520 (ToArray (StreamMap __iruid_521 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_519))) (InsertFields (SelectFields () (Ref __iruid_521)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_521))))))) (StreamAgg __iruid_522 (StreamFilter __iruid_523 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_520)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_520) (Ref __iruid_523))))) (AggLet __iruid_524 False (GetField __gt (ArrayRef -1 (Ref __iruid_520) (Ref __iruid_522))) (InsertFields (SelectFields () (Ref __iruid_519)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_524))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_524)))))))))))) (Let __iruid_525 (GetField __AC (Ref __iruid_518)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_525) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_525) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_518))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_514) (Ref __iruid_515)))) 2023-04-22 21:08:57.774 : INFO: after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_514 __iruid_515 (ToStream False (CollectDistributedArray count_per_partition __iruid_516 __iruid_517 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_518 (StreamMap __iruid_519 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_516)) (Let __iruid_520 (ToArray (StreamMap __iruid_521 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_519))) (InsertFields (SelectFields () (Ref __iruid_521)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_521))))))) (Let __iruid_526 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_522 (StreamFilter __iruid_523 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_520)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_520) (Ref __iruid_523))))) (Let __iruid_524 (GetField __gt (ArrayRef -1 (Ref __iruid_520) (Ref __iruid_522))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_524)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_524)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_519)) None (__AC (GetTupleElement 0 (Ref __iruid_526))) (__n_called (GetTupleElement 1 (Ref __iruid_526))))))) (Let __iruid_525 (GetField __AC (Ref __iruid_518)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_525) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_525) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_518))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_514) (Ref __iruid_515)))) 2023-04-22 21:08:57.828 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 87: (MakeTuple (0) (StreamFold __iruid_553 __iruid_554 (ToStream False (CollectDistributedArray count_per_partition __iruid_555 __iruid_556 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (Cast Int64 (StreamLen (StreamFilter __iruid_557 (StreamMap __iruid_558 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_555)) (Let __iruid_559 (ToArray (StreamMap __iruid_560 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_558))) (InsertFields (SelectFields () (Ref __iruid_560)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_560))))))) (Let __iruid_561 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_562 (StreamFilter __iruid_563 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_559)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_559) (Ref __iruid_563))))) (Let __iruid_564 (GetField __gt (ArrayRef -1 (Ref __iruid_559) (Ref __iruid_562))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_564)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_564)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_558)) None (__AC (GetTupleElement 0 (Ref __iruid_561))) (__n_called (GetTupleElement 1 (Ref __iruid_561))))))) (Let __iruid_565 (GetField __AC (Ref __iruid_557)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_565) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_565) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref __iruid_557))))) (False)))))) (NA String))) (I64 0) (ApplyBinaryPrimOp Add (Ref __iruid_553) (Ref __iruid_554)))) 2023-04-22 21:08:57.926 : INFO: instruction count: 3: __C284HailClassLoaderContainer. 2023-04-22 21:08:57.926 : INFO: instruction count: 3: __C284HailClassLoaderContainer. 2023-04-22 21:08:57.926 : INFO: instruction count: 3: __C286FSContainer. 2023-04-22 21:08:57.926 : INFO: instruction count: 3: __C286FSContainer. 2023-04-22 21:08:57.961 : INFO: instruction count: 3: __C288collect_distributed_array_count_per_partition. 
2023-04-22 21:08:57.962 : INFO: instruction count: 114: __C288collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:57.963 : INFO: instruction count: 17: __C288collect_distributed_array_count_per_partition.apply 2023-04-22 21:08:57.964 : INFO: instruction count: 27: __C288collect_distributed_array_count_per_partition.__m290DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.964 : INFO: instruction count: 44: __C288collect_distributed_array_count_per_partition.__m291INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:57.964 : INFO: instruction count: 31: __C288collect_distributed_array_count_per_partition.__m292INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:57.965 : INFO: instruction count: 10: __C288collect_distributed_array_count_per_partition.__m293INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:57.965 : INFO: instruction count: 27: __C288collect_distributed_array_count_per_partition.__m295DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:57.965 : INFO: instruction count: 8: __C288collect_distributed_array_count_per_partition.__m296INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:57.968 : INFO: instruction count: 298: __C288collect_distributed_array_count_per_partition.__m298split_StreamLen 2023-04-22 21:08:57.969 : INFO: instruction count: 205: __C288collect_distributed_array_count_per_partition.__m307split_ToArray 2023-04-22 21:08:57.969 : INFO: instruction count: 8: __C288collect_distributed_array_count_per_partition.__m315nNonRefAlleles 2023-04-22 21:08:57.970 : INFO: instruction count: 9: __C288collect_distributed_array_count_per_partition.__m327begin_group_0 2023-04-22 21:08:57.970 : INFO: instruction count: 17: __C288collect_distributed_array_count_per_partition.__m328begin_group_0 2023-04-22 21:08:57.970 : INFO: instruction count: 155: __C288collect_distributed_array_count_per_partition.__m329split_StreamFor 2023-04-22 21:08:57.970 : INFO: instruction count: 35: __C288collect_distributed_array_count_per_partition.__m337arrayref_bounds_check 2023-04-22 21:08:57.971 : INFO: instruction count: 73: __C288collect_distributed_array_count_per_partition.__m341begin_group_0 2023-04-22 21:08:57.971 : INFO: instruction count: 5: __C288collect_distributed_array_count_per_partition.__m344toInt64 2023-04-22 21:08:57.971 : INFO: instruction count: 11: __C288collect_distributed_array_count_per_partition.__m355ord_gt 2023-04-22 21:08:57.971 : INFO: instruction count: 16: __C288collect_distributed_array_count_per_partition.__m356ord_gtNonnull 2023-04-22 21:08:57.971 : INFO: instruction count: 11: __C288collect_distributed_array_count_per_partition.__m357ord_lt 2023-04-22 21:08:57.972 : INFO: instruction count: 16: __C288collect_distributed_array_count_per_partition.__m358ord_ltNonnull 2023-04-22 21:08:57.972 : INFO: instruction count: 13: __C288collect_distributed_array_count_per_partition.__m362ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64END 2023-04-22 21:08:57.972 : INFO: instruction count: 4: __C288collect_distributed_array_count_per_partition.__m363ENCODE_SInt64$_TO_r_int64 2023-04-22 21:08:57.972 : INFO: instruction count: 9: __C288collect_distributed_array_count_per_partition.setPartitionIndex 2023-04-22 21:08:57.972 : INFO: instruction count: 4: __C288collect_distributed_array_count_per_partition.addPartitionRegion 2023-04-22 21:08:57.972 : INFO: instruction count: 
4: __C288collect_distributed_array_count_per_partition.setPool 2023-04-22 21:08:57.972 : INFO: instruction count: 3: __C288collect_distributed_array_count_per_partition.addHailClassLoader 2023-04-22 21:08:57.972 : INFO: instruction count: 3: __C288collect_distributed_array_count_per_partition.addFS 2023-04-22 21:08:57.973 : INFO: instruction count: 4: __C288collect_distributed_array_count_per_partition.addTaskContext 2023-04-22 21:08:57.973 : INFO: instruction count: 3: __C288collect_distributed_array_count_per_partition.setObjects 2023-04-22 21:08:57.973 : INFO: instruction count: 3: __C359staticWrapperClass_1. 2023-04-22 21:08:58.029 : INFO: encoder cache hit 2023-04-22 21:08:58.030 MemoryStore: INFO: Block broadcast_3 stored as values in memory (estimated size 688.0 B, free 28.8 GiB) 2023-04-22 21:08:58.033 MemoryStore: INFO: Block broadcast_3_piece0 stored as bytes in memory (estimated size 217.0 B, free 28.8 GiB) 2023-04-22 21:08:58.036 BlockManagerInfo: INFO: Added broadcast_3_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 28.8 GiB) 2023-04-22 21:08:58.038 SparkContext: INFO: Created broadcast 3 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:58.038 : INFO: instruction count: 3: __C271HailClassLoaderContainer. 2023-04-22 21:08:58.038 : INFO: instruction count: 3: __C271HailClassLoaderContainer. 2023-04-22 21:08:58.038 : INFO: instruction count: 3: __C273FSContainer. 2023-04-22 21:08:58.039 : INFO: instruction count: 3: __C273FSContainer. 2023-04-22 21:08:58.046 : INFO: instruction count: 3: __C275Compiled. 2023-04-22 21:08:58.046 : INFO: instruction count: 25: __C275Compiled.apply 2023-04-22 21:08:58.047 : INFO: instruction count: 343: __C275Compiled.__m277split_StreamFold 2023-04-22 21:08:58.047 : INFO: instruction count: 4: __C275Compiled.setBackend 2023-04-22 21:08:58.047 : INFO: instruction count: 9: __C275Compiled.__m372ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:08:58.047 : INFO: instruction count: 49: __C275Compiled.__m373ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:58.048 : INFO: instruction count: 16: __C275Compiled.__m374ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:08:58.048 : INFO: instruction count: 4: __C275Compiled.__m375ENCODE_SInt32$_TO_r_int32 2023-04-22 21:08:58.048 : INFO: instruction count: 9: __C275Compiled.__m376ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:08:58.048 : INFO: instruction count: 1: __C275Compiled.__m377ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:08:58.048 : INFO: instruction count: 27: __C275Compiled.__m380DECODE_r_struct_of_r_int64END_TO_SBaseStructPointer 2023-04-22 21:08:58.048 : INFO: instruction count: 10: __C275Compiled.__m381INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:08:58.048 : INFO: instruction count: 9: __C275Compiled.setPartitionIndex 2023-04-22 21:08:58.049 : INFO: instruction count: 4: __C275Compiled.addPartitionRegion 2023-04-22 21:08:58.049 : INFO: instruction count: 4: __C275Compiled.setPool 2023-04-22 21:08:58.049 : INFO: instruction count: 3: __C275Compiled.addHailClassLoader 2023-04-22 21:08:58.049 : INFO: instruction count: 3: __C275Compiled.addFS 2023-04-22 21:08:58.049 : INFO: instruction count: 4: __C275Compiled.addTaskContext 2023-04-22 21:08:58.049 : INFO: instruction count: 3: __C275Compiled.setObjects 2023-04-22 21:08:58.049 : INFO: instruction count: 64: __C275Compiled.addAndDecodeLiterals 2023-04-22 
21:08:58.049 : INFO: instruction count: 36: __C275Compiled.__m394DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:08:58.050 : INFO: instruction count: 8: __C275Compiled.__m395INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:08:58.050 : INFO: instruction count: 58: __C275Compiled.__m396INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:58.050 : INFO: instruction count: 44: __C275Compiled.__m397INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:08:58.050 : INFO: instruction count: 31: __C275Compiled.__m398INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:08:58.050 : INFO: instruction count: 10: __C275Compiled.__m399INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:08:58.051 : INFO: instruction count: 3: __C378staticWrapperClass_1. 2023-04-22 21:08:58.090 : INFO: executing D-Array [count_per_partition] with 8 tasks 2023-04-22 21:08:58.091 MemoryStore: INFO: Block broadcast_4 stored as values in memory (estimated size 64.0 B, free 28.8 GiB) 2023-04-22 21:08:58.099 MemoryStore: INFO: Block broadcast_4_piece0 stored as bytes in memory (estimated size 49.0 B, free 28.8 GiB) 2023-04-22 21:08:58.099 BlockManagerInfo: INFO: Added broadcast_4_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 49.0 B, free: 28.8 GiB) 2023-04-22 21:08:58.100 SparkContext: INFO: Created broadcast 4 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:58.108 MemoryStore: INFO: Block broadcast_5 stored as values in memory (estimated size 429.5 KiB, free 28.8 GiB) 2023-04-22 21:08:58.225 MemoryStore: INFO: Block broadcast_5_piece0 stored as bytes in memory (estimated size 32.4 KiB, free 28.8 GiB) 2023-04-22 21:08:58.226 BlockManagerInfo: INFO: Added broadcast_5_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 32.4 KiB, free: 28.8 GiB) 2023-04-22 21:08:58.229 SparkContext: INFO: Created broadcast 5 from broadcast at SparkBackend.scala:354 2023-04-22 21:08:58.910 BlockManagerInfo: INFO: Removed broadcast_1_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 217.0 B, free: 28.8 GiB) 2023-04-22 21:08:59.530 SparkContext: INFO: Starting job: collect at SparkBackend.scala:368 2023-04-22 21:08:59.555 DAGScheduler: INFO: Got job 0 (collect at SparkBackend.scala:368) with 8 output partitions 2023-04-22 21:08:59.556 DAGScheduler: INFO: Final stage: ResultStage 0 (collect at SparkBackend.scala:368) 2023-04-22 21:08:59.556 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:08:59.557 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:08:59.575 DAGScheduler: INFO: Submitting ResultStage 0 (SparkBackendComputeRDD[0] at RDD at SparkBackend.scala:784), which has no missing parents 2023-04-22 21:08:59.760 MemoryStore: INFO: Block broadcast_6 stored as values in memory (estimated size 752.9 KiB, free 28.8 GiB) 2023-04-22 21:08:59.811 MemoryStore: INFO: Block broadcast_6_piece0 stored as bytes in memory (estimated size 378.5 KiB, free 28.8 GiB) 2023-04-22 21:08:59.812 BlockManagerInfo: INFO: Added broadcast_6_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 378.5 KiB, free: 28.8 GiB) 2023-04-22 21:08:59.814 SparkContext: INFO: Created broadcast 6 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:08:59.856 DAGScheduler: INFO: Submitting 8 missing tasks from 
ResultStage 0 (SparkBackendComputeRDD[0] at RDD at SparkBackend.scala:784) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:08:59.858 TaskSchedulerImpl: INFO: Adding task set 0.0 with 8 tasks resource profile 0 2023-04-22 21:08:59.931 TaskSetManager: INFO: Starting task 0.0 in stage 0.0 (TID 0) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:08:59.997 Executor: INFO: Running task 0.0 in stage 0.0 (TID 0) 2023-04-22 21:09:00.743 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 0.0 in stage 0.0 (TID 0) 2023-04-22 21:09:00.883 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 0.0 (TID 0) 2023-04-22 21:09:07.129 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:07.129 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 0.0 (TID 0) 2023-04-22 21:09:07.147 Executor: INFO: Finished task 0.0 in stage 0.0 (TID 0). 826 bytes result sent to driver 2023-04-22 21:09:07.153 TaskSetManager: INFO: Starting task 1.0 in stage 0.0 (TID 1) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:07.165 Executor: INFO: Running task 1.0 in stage 0.0 (TID 1) 2023-04-22 21:09:07.193 TaskSetManager: INFO: Finished task 0.0 in stage 0.0 (TID 0) in 7274 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:09:07.307 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 1.0 in stage 0.0 (TID 1) 2023-04-22 21:09:07.324 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 0.0 (TID 1) 2023-04-22 21:09:12.767 : INFO: TaskReport: stage=0, partition=1, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:12.768 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 0.0 (TID 1) 2023-04-22 21:09:12.769 Executor: INFO: Finished task 1.0 in stage 0.0 (TID 1). 
826 bytes result sent to driver 2023-04-22 21:09:12.773 TaskSetManager: INFO: Starting task 2.0 in stage 0.0 (TID 2) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:12.774 Executor: INFO: Running task 2.0 in stage 0.0 (TID 2) 2023-04-22 21:09:12.892 TaskSetManager: INFO: Finished task 1.0 in stage 0.0 (TID 1) in 5740 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:09:12.924 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 2.0 in stage 0.0 (TID 2) 2023-04-22 21:09:12.940 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 0.0 (TID 2) 2023-04-22 21:09:18.422 : INFO: TaskReport: stage=0, partition=2, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:18.422 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 0.0 (TID 2) 2023-04-22 21:09:18.433 Executor: INFO: Finished task 2.0 in stage 0.0 (TID 2). 869 bytes result sent to driver 2023-04-22 21:09:18.434 TaskSetManager: INFO: Starting task 3.0 in stage 0.0 (TID 3) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:18.439 Executor: INFO: Running task 3.0 in stage 0.0 (TID 3) 2023-04-22 21:09:18.479 TaskSetManager: INFO: Finished task 2.0 in stage 0.0 (TID 2) in 5706 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:09:18.507 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 3.0 in stage 0.0 (TID 3) 2023-04-22 21:09:18.541 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 0.0 (TID 3) 2023-04-22 21:09:23.924 : INFO: TaskReport: stage=0, partition=3, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:23.925 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 0.0 (TID 3) 2023-04-22 21:09:23.926 Executor: INFO: Finished task 3.0 in stage 0.0 (TID 3). 
826 bytes result sent to driver 2023-04-22 21:09:23.928 TaskSetManager: INFO: Starting task 4.0 in stage 0.0 (TID 4) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:23.929 Executor: INFO: Running task 4.0 in stage 0.0 (TID 4) 2023-04-22 21:09:23.966 TaskSetManager: INFO: Finished task 3.0 in stage 0.0 (TID 3) in 5532 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:09:23.999 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 4.0 in stage 0.0 (TID 4) 2023-04-22 21:09:24.015 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 0.0 (TID 4) 2023-04-22 21:09:29.246 : INFO: TaskReport: stage=0, partition=4, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:29.247 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 0.0 (TID 4) 2023-04-22 21:09:29.248 Executor: INFO: Finished task 4.0 in stage 0.0 (TID 4). 826 bytes result sent to driver 2023-04-22 21:09:29.249 TaskSetManager: INFO: Starting task 5.0 in stage 0.0 (TID 5) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:29.260 TaskSetManager: INFO: Finished task 4.0 in stage 0.0 (TID 4) in 5332 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:09:29.267 Executor: INFO: Running task 5.0 in stage 0.0 (TID 5) 2023-04-22 21:09:29.312 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 5.0 in stage 0.0 (TID 5) 2023-04-22 21:09:29.331 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 5.0 in stage 0.0 (TID 5) 2023-04-22 21:09:34.547 : INFO: TaskReport: stage=0, partition=5, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:34.547 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 5.0 in stage 0.0 (TID 5) 2023-04-22 21:09:34.552 Executor: INFO: Finished task 5.0 in stage 0.0 (TID 5). 
826 bytes result sent to driver 2023-04-22 21:09:34.553 TaskSetManager: INFO: Starting task 6.0 in stage 0.0 (TID 6) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:34.554 TaskSetManager: INFO: Finished task 5.0 in stage 0.0 (TID 5) in 5305 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:09:34.557 Executor: INFO: Running task 6.0 in stage 0.0 (TID 6) 2023-04-22 21:09:34.693 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 6.0 in stage 0.0 (TID 6) 2023-04-22 21:09:34.708 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 6.0 in stage 0.0 (TID 6) 2023-04-22 21:09:39.981 : INFO: TaskReport: stage=0, partition=6, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:39.981 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 6.0 in stage 0.0 (TID 6) 2023-04-22 21:09:39.984 Executor: INFO: Finished task 6.0 in stage 0.0 (TID 6). 869 bytes result sent to driver 2023-04-22 21:09:39.985 TaskSetManager: INFO: Starting task 7.0 in stage 0.0 (TID 7) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:09:39.985 TaskSetManager: INFO: Finished task 6.0 in stage 0.0 (TID 6) in 5432 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:09:40.005 Executor: INFO: Running task 7.0 in stage 0.0 (TID 7) 2023-04-22 21:09:40.053 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 7.0 in stage 0.0 (TID 7) 2023-04-22 21:09:40.079 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 7.0 in stage 0.0 (TID 7) 2023-04-22 21:09:45.319 : INFO: TaskReport: stage=0, partition=7, attempt=0, peakBytes=327680, peakBytesReadable=320.00 KiB, chunks requested=14323, cache hits=14322 2023-04-22 21:09:45.320 : INFO: RegionPool: FREE: 320.0K allocated (256.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 7.0 in stage 0.0 (TID 7) 2023-04-22 21:09:45.323 Executor: INFO: Finished task 7.0 in stage 0.0 (TID 7). 826 bytes result sent to driver 2023-04-22 21:09:45.327 TaskSetManager: INFO: Finished task 7.0 in stage 0.0 (TID 7) in 5343 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:09:45.328 TaskSchedulerImpl: INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool 2023-04-22 21:09:45.329 DAGScheduler: INFO: ResultStage 0 (collect at SparkBackend.scala:368) finished in 45.710 s 2023-04-22 21:09:45.333 DAGScheduler: INFO: Job 0 is finished. 
Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:09:45.333 TaskSchedulerImpl: INFO: Killing all running tasks in stage 0: Stage finished 2023-04-22 21:09:45.348 DAGScheduler: INFO: Job 0 finished: collect at SparkBackend.scala:368, took 45.817516 s 2023-04-22 21:09:45.437 : INFO: executed D-Array [count_per_partition] in 47.347s 2023-04-22 21:09:45.453 : INFO: took 52.820s 2023-04-22 21:09:45.453 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (I64 114591) 2023-04-22 21:09:45.454 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (I64 114591) 2023-04-22 21:09:45.454 : INFO: after EvalRelationalLets: IR size 1: (I64 114591) 2023-04-22 21:09:45.454 : INFO: after LowerAndExecuteShuffles: IR size 1: (I64 114591) 2023-04-22 21:09:45.454 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (I64 114591) 2023-04-22 21:09:45.454 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (I64 114591) 2023-04-22 21:09:45.455 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (I64 114591) 2023-04-22 21:09:45.455 : INFO: initial IR: IR size 2: (MakeTuple (0) (I64 114591)) 2023-04-22 21:09:45.456 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.456 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.457 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.457 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.457 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.469 : INFO: encoder cache miss (2 hits, 2 misses, 0.500) 2023-04-22 21:09:45.471 : INFO: instruction count: 3: __C413HailClassLoaderContainer. 2023-04-22 21:09:45.471 : INFO: instruction count: 3: __C413HailClassLoaderContainer. 2023-04-22 21:09:45.471 : INFO: instruction count: 3: __C415FSContainer. 2023-04-22 21:09:45.472 : INFO: instruction count: 3: __C415FSContainer. 2023-04-22 21:09:45.482 : INFO: instruction count: 3: __C417etypeEncode. 2023-04-22 21:09:45.482 : INFO: instruction count: 7: __C417etypeEncode.apply 2023-04-22 21:09:45.482 : INFO: instruction count: 9: __C417etypeEncode.__m419ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int64ENDEND 2023-04-22 21:09:45.482 : INFO: instruction count: 13: __C417etypeEncode.__m420ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64END 2023-04-22 21:09:45.483 : INFO: instruction count: 4: __C417etypeEncode.__m421ENCODE_SInt64$_TO_r_int64 2023-04-22 21:09:45.485 MemoryStore: INFO: Block broadcast_7 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:45.501 MemoryStore: INFO: Block broadcast_7_piece0 stored as bytes in memory (estimated size 65.0 B, free 28.8 GiB) 2023-04-22 21:09:45.502 BlockManagerInfo: INFO: Added broadcast_7_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 65.0 B, free: 28.8 GiB) 2023-04-22 21:09:45.502 SparkContext: INFO: Created broadcast 7 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:45.503 : INFO: instruction count: 3: __C400HailClassLoaderContainer. 2023-04-22 21:09:45.503 : INFO: instruction count: 3: __C400HailClassLoaderContainer. 2023-04-22 21:09:45.503 : INFO: instruction count: 3: __C402FSContainer. 2023-04-22 21:09:45.503 : INFO: instruction count: 3: __C402FSContainer. 
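Reading the stage above: the eight count_per_partition tasks ran back-to-back on the single driver executor (the same worker thread 61), each taking roughly 5.3-7.3 s, which is why ResultStage 0 took 45.710 s and the D-Array executed in 47.347 s. A minimal sketch for tabulating those per-task durations from the TaskSetManager records — assuming this log text is saved to a file, here hypothetically named hail.log:

    import re

    # Matches e.g. "Finished task 3.0 in stage 0.0 (TID 3) in 5532 ms"
    finished = re.compile(r"Finished task (\d+\.\d+) in stage (\d+\.\d+) \(TID (\d+)\) in (\d+) ms")

    durations = []
    with open("hail.log") as fh:  # hypothetical file name for the log text above
        for line in fh:
            # findall tolerates several records fused onto one line
            for task, stage, tid, ms in finished.findall(line):
                durations.append((int(tid), int(ms)))

    for tid, ms in sorted(durations):
        print(f"TID {tid}: {ms / 1000:.1f} s")
    if durations:
        total = sum(ms for _, ms in durations)
        print(f"sum of task times: {total / 1000:.1f} s over {len(durations)} tasks")

The Executor "Finished task ... bytes result sent to driver" records deliberately do not match the pattern; only the TaskSetManager records carry the elapsed milliseconds.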
2023-04-22 21:09:45.505 : INFO: instruction count: 3: __C404Compiled. 2023-04-22 21:09:45.505 : INFO: instruction count: 7: __C404Compiled.apply 2023-04-22 21:09:45.505 : INFO: instruction count: 9: __C404Compiled.setPartitionIndex 2023-04-22 21:09:45.505 : INFO: instruction count: 4: __C404Compiled.addPartitionRegion 2023-04-22 21:09:45.505 : INFO: instruction count: 4: __C404Compiled.setPool 2023-04-22 21:09:45.505 : INFO: instruction count: 3: __C404Compiled.addHailClassLoader 2023-04-22 21:09:45.505 : INFO: instruction count: 3: __C404Compiled.addFS 2023-04-22 21:09:45.505 : INFO: instruction count: 4: __C404Compiled.addTaskContext 2023-04-22 21:09:45.506 : INFO: instruction count: 41: __C404Compiled.addAndDecodeLiterals 2023-04-22 21:09:45.506 : INFO: instruction count: 27: __C404Compiled.__m410DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:45.506 : INFO: instruction count: 17: __C404Compiled.__m411INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:09:45.506 : INFO: instruction count: 10: __C404Compiled.__m412INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:09:45.506 : INFO: initial IR: IR size 2: (MakeTuple (0) (I64 114591)) 2023-04-22 21:09:45.507 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.507 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.507 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.507 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.508 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.542 : INFO: encoder cache hit 2023-04-22 21:09:45.543 MemoryStore: INFO: Block broadcast_8 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:45.544 MemoryStore: INFO: Block broadcast_8_piece0 stored as bytes in memory (estimated size 65.0 B, free 28.8 GiB) 2023-04-22 21:09:45.554 BlockManagerInfo: INFO: Added broadcast_8_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 65.0 B, free: 28.8 GiB) 2023-04-22 21:09:45.555 SparkContext: INFO: Created broadcast 8 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:45.555 : INFO: instruction count: 3: __C422HailClassLoaderContainer. 2023-04-22 21:09:45.555 : INFO: instruction count: 3: __C422HailClassLoaderContainer. 2023-04-22 21:09:45.556 : INFO: instruction count: 3: __C424FSContainer. 2023-04-22 21:09:45.556 : INFO: instruction count: 3: __C424FSContainer. 2023-04-22 21:09:45.557 : INFO: instruction count: 3: __C426Compiled. 
2023-04-22 21:09:45.557 : INFO: instruction count: 7: __C426Compiled.apply 2023-04-22 21:09:45.557 : INFO: instruction count: 9: __C426Compiled.setPartitionIndex 2023-04-22 21:09:45.557 : INFO: instruction count: 4: __C426Compiled.addPartitionRegion 2023-04-22 21:09:45.558 : INFO: instruction count: 4: __C426Compiled.setPool 2023-04-22 21:09:45.558 : INFO: instruction count: 3: __C426Compiled.addHailClassLoader 2023-04-22 21:09:45.558 : INFO: instruction count: 3: __C426Compiled.addFS 2023-04-22 21:09:45.558 : INFO: instruction count: 4: __C426Compiled.addTaskContext 2023-04-22 21:09:45.558 : INFO: instruction count: 41: __C426Compiled.addAndDecodeLiterals 2023-04-22 21:09:45.558 : INFO: instruction count: 27: __C426Compiled.__m432DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:45.558 : INFO: instruction count: 17: __C426Compiled.__m433INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:09:45.558 : INFO: instruction count: 10: __C426Compiled.__m434INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:09:45.559 : INFO: initial IR: IR size 2: (MakeTuple (0) (I64 114591)) 2023-04-22 21:09:45.559 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.559 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.559 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.560 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.560 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:09:45.579 : INFO: encoder cache hit 2023-04-22 21:09:45.580 MemoryStore: INFO: Block broadcast_9 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:45.582 MemoryStore: INFO: Block broadcast_9_piece0 stored as bytes in memory (estimated size 65.0 B, free 28.8 GiB) 2023-04-22 21:09:45.583 BlockManagerInfo: INFO: Added broadcast_9_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 65.0 B, free: 28.8 GiB) 2023-04-22 21:09:45.583 SparkContext: INFO: Created broadcast 9 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:45.584 : INFO: instruction count: 3: __C435HailClassLoaderContainer. 2023-04-22 21:09:45.584 : INFO: instruction count: 3: __C435HailClassLoaderContainer. 2023-04-22 21:09:45.584 : INFO: instruction count: 3: __C437FSContainer. 2023-04-22 21:09:45.584 : INFO: instruction count: 3: __C437FSContainer. 2023-04-22 21:09:45.585 : INFO: instruction count: 3: __C439Compiled. 
2023-04-22 21:09:45.586 : INFO: instruction count: 7: __C439Compiled.apply 2023-04-22 21:09:45.586 : INFO: instruction count: 9: __C439Compiled.setPartitionIndex 2023-04-22 21:09:45.586 : INFO: instruction count: 4: __C439Compiled.addPartitionRegion 2023-04-22 21:09:45.586 : INFO: instruction count: 4: __C439Compiled.setPool 2023-04-22 21:09:45.586 : INFO: instruction count: 3: __C439Compiled.addHailClassLoader 2023-04-22 21:09:45.586 : INFO: instruction count: 3: __C439Compiled.addFS 2023-04-22 21:09:45.586 : INFO: instruction count: 4: __C439Compiled.addTaskContext 2023-04-22 21:09:45.586 : INFO: instruction count: 41: __C439Compiled.addAndDecodeLiterals 2023-04-22 21:09:45.586 : INFO: instruction count: 27: __C439Compiled.__m445DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:45.586 : INFO: instruction count: 17: __C439Compiled.__m446INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:09:45.587 : INFO: instruction count: 10: __C439Compiled.__m447INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:09:45.607 : INFO: encoder cache miss (4 hits, 3 misses, 0.571) 2023-04-22 21:09:45.608 : INFO: instruction count: 3: __C448HailClassLoaderContainer. 2023-04-22 21:09:45.608 : INFO: instruction count: 3: __C448HailClassLoaderContainer. 2023-04-22 21:09:45.608 : INFO: instruction count: 3: __C450FSContainer. 2023-04-22 21:09:45.608 : INFO: instruction count: 3: __C450FSContainer. 2023-04-22 21:09:45.609 : INFO: instruction count: 3: __C452etypeEncode. 2023-04-22 21:09:45.631 : INFO: instruction count: 11: __C452etypeEncode.apply 2023-04-22 21:09:45.631 : INFO: instruction count: 4: __C452etypeEncode.__m454ENCODE_SInt64$_TO_o_int64 2023-04-22 21:09:45.633 : INFO: finished execution of query hail_query_1, result size is 8.00 B 2023-04-22 21:09:45.633 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:09:45.652 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:09:45.652 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:09:45.652 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:09:45.652 : INFO: timing SparkBackend.executeEncode total 54.511s self 517.297ms children 53.994s %children 99.05% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 381.294ms self 1.101ms children 380.193ms %children 99.71% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.983ms self 0.983ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 378.874ms self 2.257ms children 376.617ms %children 99.40% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 376.617ms self 14.390ms children 362.227ms %children 96.18% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 26.529ms self 26.529ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial 
IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 11.213ms self 11.213ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.653 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 10.852ms self 10.852ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 61.168ms self 61.168ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 43.930ms self 43.930ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 2.917ms self 2.917ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 67.313ms self 67.313ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.760ms self 0.760ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.213ms self 1.213ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 2.473ms self 2.473ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.654 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.573ms self 1.573ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 29.518ms self 29.518ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.490ms self 0.490ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 29.640ms self 29.640ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.683ms self 0.683ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.791ms self 0.791ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 3.567ms self 3.567ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.655 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 4.980ms self 4.980ms children 0.000ms %children 0.00% 
2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 56.826ms self 56.826ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.525ms self 0.525ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.264ms self 5.264ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.336ms self 0.336ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 90.095ms self 0.020ms children 90.075ms %children 99.98% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.123ms self 0.123ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 88.219ms self 88.219ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.656 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 1.733ms self 1.733ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 298.297ms self 0.016ms children 298.281ms %children 99.99% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.293ms self 0.293ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 297.705ms self 0.030ms children 297.676ms %children 99.99% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 297.676ms self 0.187ms children 297.489ms %children 99.94% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 2.155ms self 2.155ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.567ms self 1.567ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 29.022ms self 29.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.657 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 94.738ms self 94.738ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 20.682ms self 20.682ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets 
total 0.852ms self 0.852ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 24.542ms self 24.542ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 1.080ms self 1.080ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 28.756ms self 28.756ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 2.801ms self 2.801ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.658 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 34.022ms self 34.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 22.319ms self 22.319ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.526ms self 0.526ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 8.660ms self 8.660ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.697ms self 0.697ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.493ms self 0.493ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 2.426ms self 2.426ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 2.088ms self 2.088ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.659 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 13.437ms self 13.437ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.536ms self 0.536ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 6.090ms self 6.090ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.282ms self 0.282ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 4.135ms self 0.020ms children 4.115ms %children 99.52% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.238ms self 0.238ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 3.576ms self 3.576ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.301ms self 0.301ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.660 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 53.084s self 0.025ms children 53.084s %children 100.00% 2023-04-22 21:09:45.661 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.266ms self 0.266ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.661 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 53.083s self 50.762ms children 53.033s %children 99.90% 2023-04-22 21:09:45.663 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 39.380ms self 0.020ms children 39.360ms %children 99.95% 2023-04-22 21:09:45.663 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 36.504ms self 36.504ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 2.543ms self 2.543ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.313ms self 0.313ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 148.000ms self 0.016ms children 147.984ms %children 99.99% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.178ms self 0.178ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 147.523ms self 0.026ms children 147.497ms %children 99.98% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 147.497ms self 0.265ms children 147.232ms %children 99.82% 2023-04-22 21:09:45.664 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.600ms self 0.600ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.579ms self 0.579ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.873ms self 1.873ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 37.091ms self 37.091ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 21.548ms self 21.548ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.486ms self 0.486ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 11.824ms self 11.824ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.832ms self 0.832ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.665 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.483ms self 0.483ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.668ms self 1.668ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.832ms self 1.832ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 22.069ms self 22.069ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.418ms self 0.418ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 7.102ms self 7.102ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.823ms self 0.823ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.536ms self 0.536ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.666 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.705ms self 1.705ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.799ms self 1.799ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 17.492ms self 17.492ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.401ms self 0.401ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 16.071ms self 16.071ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.282ms self 0.282ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 52.845s self 0.025ms children 52.845s %children 100.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.210ms self 0.210ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.667 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 52.845s self 2.369s children 50.476s %children 95.52% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 234.681ms self 0.019ms children 234.662ms %children 99.99% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.259ms self 0.259ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 234.060ms self 0.037ms children 234.022ms 
%children 99.98% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 234.022ms self 0.208ms children 233.814ms %children 99.91% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.936ms self 0.936ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 24.628ms self 24.628ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 6.033ms self 6.033ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.668 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 22.169ms self 22.169ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 28.441ms self 28.441ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 2.294ms self 2.294ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 22.423ms self 22.423ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.945ms self 0.945ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.451ms self 0.451ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.273ms self 1.273ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.669ms self 1.669ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.669 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 29.129ms self 29.129ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.347ms self 5.347ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 52.585ms self 52.585ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.450ms self 0.450ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.253ms self 1.253ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.655ms self 1.655ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 14.515ms self 14.515ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.670 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 16.847ms self 16.847ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, 
initial IR/Verify total 0.343ms self 0.343ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 0.890ms self 0.029ms children 0.861ms %children 96.72% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.260ms self 0.260ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.329ms self 0.329ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.272ms self 0.272ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 106.799ms self 0.030ms children 106.770ms %children 99.97% 2023-04-22 21:09:45.671 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.229ms self 0.229ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.672 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 106.214ms self 0.035ms children 106.179ms %children 99.97% 2023-04-22 21:09:45.672 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 106.179ms self 0.179ms children 105.999ms %children 99.83% 2023-04-22 21:09:45.694 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.782ms self 0.782ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.694 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.444ms self 0.444ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.694 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.329ms self 1.329ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.694 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.534ms self 1.534ms children 0.000ms %children 0.00% 2023-04-22 
21:09:45.694 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 9.752ms self 9.752ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.350ms self 0.350ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 13.125ms self 13.125ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.694ms self 0.694ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.757ms self 0.757ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.031ms self 1.031ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 9.845ms self 9.845ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 8.009ms self 8.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.695 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 11.510ms self 11.510ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 4.202ms self 4.202ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.687ms self 0.687ms children 0.000ms %children 0.00% 
2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.429ms self 0.429ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.001ms self 1.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.508ms self 1.508ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 28.718ms self 28.718ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 2.329ms self 2.329ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 7.964ms self 7.964ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.696 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.327ms self 0.327ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.760ms self 0.015ms children 0.745ms %children 98.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.258ms self 0.258ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.238ms self 0.238ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.248ms self 0.248ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.695ms self 0.013ms children 0.682ms %children 98.10% 2023-04-22 
21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.250ms self 0.250ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.304ms self 0.304ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 1.047ms self 0.014ms children 1.034ms %children 98.68% 2023-04-22 21:09:45.697 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.309ms self 0.309ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.399ms self 0.399ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.325ms self 0.325ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 93.922ms self 0.015ms children 93.907ms %children 99.98% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.220ms self 0.220ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 93.487ms self 0.032ms children 93.455ms %children 99.97% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 93.455ms self 0.165ms children 93.290ms %children 99.82% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.524ms self 0.524ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.698 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.454ms self 0.454ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.108ms self 1.108ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.506ms self 1.506ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 10.359ms self 10.359ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.349ms self 0.349ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 6.555ms self 6.555ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.618ms self 0.618ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.992ms self 0.992ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 17.288ms self 17.288ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.699 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 17.644ms self 17.644ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.358ms self 0.358ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 3.947ms self 3.947ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.523ms self 0.523ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.393ms self 0.393ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.919ms self 0.919ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.951ms self 0.951ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 24.296ms self 24.296ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.309ms self 0.309ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.700 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 3.802ms self 3.802ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 1.915ms self 0.015ms children 1.899ms %children 99.20% 2023-04-22 21:09:45.701 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.114ms self 0.114ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 1.215ms self 1.215ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.570ms self 0.570ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 69.406ms self 0.014ms children 69.391ms %children 99.98% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 69.156ms self 0.030ms children 69.126ms %children 99.96% 2023-04-22 21:09:45.701 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 69.126ms self 0.153ms children 68.973ms %children 99.78% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.342ms self 0.342ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.387ms self 0.387ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 5.018ms self 5.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.960ms self 0.960ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 3.791ms self 3.791ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.281ms self 0.281ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 3.724ms self 3.724ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.450ms self 0.450ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.702 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.399ms self 0.399ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.703 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.885ms self 0.885ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.703 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.761ms self 0.761ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 28.323ms self 28.323ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.301ms self 0.301ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 3.456ms self 3.456ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.432ms self 0.432ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.351ms self 0.351ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.891ms self 0.891ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.778ms self 0.778ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 3.124ms self 3.124ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.276ms self 0.276ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 14.042ms self 14.042ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 2.580s self 1.753s children 826.314ms %children 32.03% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 78.610ms self 0.016ms children 78.594ms %children 99.98% 2023-04-22 21:09:45.714 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 78.159ms self 0.070ms children 78.089ms %children 99.91% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize total 78.089ms self 0.194ms children 77.895ms %children 99.75% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.792ms self 0.792ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.375ms self 0.375ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 25.522ms self 25.522ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.874ms self 0.874ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 5.517ms self 5.517ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.221ms self 0.221ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.736ms self 5.736ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.405ms self 0.405ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.333ms self 0.333ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.963ms self 0.963ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.787ms self 0.787ms children 0.000ms %children 0.00% 
2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.238ms self 3.238ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.226ms self 0.226ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.715 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.226ms self 3.226ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.402ms self 0.402ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.351ms self 0.351ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.925ms self 0.925ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.786ms self 0.786ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 15.588ms self 15.588ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.292ms self 0.292ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 11.335ms self 11.335ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 1.384ms self 0.022ms children 1.362ms %children 98.42% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.472ms self 0.472ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.557ms self 0.557ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.333ms self 0.333ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 61.388ms self 0.016ms children 61.373ms %children 99.97% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 61.238ms self 0.049ms children 61.189ms %children 99.92% 2023-04-22 21:09:45.716 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 61.189ms self 0.179ms children 61.009ms %children 99.71% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.250ms self 0.250ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.332ms self 0.332ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.899ms self 0.899ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.793ms self 0.793ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 2.989ms self 2.989ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.204ms self 0.204ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 21.015ms self 21.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.430ms self 0.430ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.857ms self 0.857ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.786ms self 0.786ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 5.427ms self 5.427ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.218ms self 0.218ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 8.153ms self 8.153ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.401ms self 0.401ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.717 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.849ms self 0.849ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.773ms self 0.773ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 12.751ms self 12.751ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.221ms self 0.221ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 3.001ms self 3.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.108ms self 0.108ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 81.133ms self 0.018ms children 81.115ms %children 99.98% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 80.564ms self 80.564ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.436ms self 0.436ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 124.650ms self 0.014ms children 124.636ms %children 99.99% 2023-04-22 21:09:45.718 : 
INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 124.474ms self 0.048ms children 124.426ms %children 99.96% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 124.426ms self 0.190ms children 124.236ms %children 99.85% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 13.214ms self 13.214ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.718 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.798ms self 0.798ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 2.752ms self 2.752ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.654ms self 1.654ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 5.018ms self 5.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.396ms self 0.396ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 24.428ms self 24.428ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.501ms self 0.501ms children 0.000ms 
%children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.357ms self 0.357ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.945ms self 0.945ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.908ms self 0.908ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.546ms self 3.546ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.231ms self 0.231ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 46.941ms self 46.941ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.489ms self 0.489ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.371ms self 0.371ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.928ms self 0.928ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.916ms self 0.916ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.719 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 16.553ms self 16.553ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.261ms self 0.261ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.032ms self 3.032ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 42.265ms self 42.265ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 52.824ms self 0.017ms children 52.807ms %children 99.97% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 52.702ms self 0.050ms children 52.652ms %children 99.90% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 52.652ms self 0.183ms children 52.468ms %children 99.65% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.269ms self 0.269ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.473ms self 0.473ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.399ms self 1.399ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.992ms self 0.992ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 14.624ms self 14.624ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.202ms self 0.202ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.029ms self 3.029ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.720 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.345ms self 0.345ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.329ms self 0.329ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.719ms self 0.719ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.749ms self 0.749ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 19.814ms self 19.814ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.264ms self 0.264ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 2.425ms self 2.425ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.296ms self 0.296ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.305ms self 0.305ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.738ms self 0.738ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.742ms self 0.742ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 2.329ms self 2.329ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 2.289ms self 2.289ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.379ms self 0.012ms children 0.366ms %children 96.80% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.096ms self 0.096ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.721 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.722 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.722 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 57.637ms self 0.012ms children 57.625ms %children 99.98% 2023-04-22 21:09:45.722 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.722 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 57.525ms self 0.046ms children 57.479ms %children 99.92% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 57.479ms self 0.157ms children 57.322ms %children 99.73% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.197ms self 0.197ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.301ms self 0.301ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.748ms self 0.748ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.732ms self 0.732ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 14.326ms self 14.326ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 2.313ms self 2.313ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.737 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.288ms self 0.288ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.299ms self 0.299ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.709ms self 0.709ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.725ms self 0.725ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 5.678ms self 5.678ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.164ms self 0.164ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 5.277ms self 5.277ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.283ms self 0.283ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.298ms self 0.298ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.620ms self 1.620ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.737ms self 0.737ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.876 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.911ms self 4.911ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 17.421ms self 17.421ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 9.950ms self 0.016ms children 9.934ms %children 99.84% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.091ms self 0.091ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 9.513ms self 9.513ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 103.356ms self 0.015ms children 103.341ms %children 99.99% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 103.079ms self 0.045ms children 103.034ms %children 99.96% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 103.034ms self 0.172ms children 102.862ms %children 99.83% 2023-04-22 21:09:45.877 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.235ms self 0.235ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.360ms self 0.360ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.097ms self 1.097ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.070ms self 1.070ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.074ms self 3.074ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.299ms self 0.299ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.787ms self 2.787ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.548ms self 0.548ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 64.415ms self 64.415ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.062ms self 1.062ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.877 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.080ms self 1.080ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.245ms self 3.245ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.299ms self 0.299ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.753ms self 2.753ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 11.657ms self 11.657ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.364ms self 0.364ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.902ms self 0.902ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.047ms self 1.047ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.588ms self 3.588ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.298ms self 0.298ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.682ms self 2.682ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.227ms self 0.227ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 4.589ms self 4.589ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 56.114ms self 0.015ms children 56.099ms %children 99.97% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 55.964ms self 0.050ms children 55.914ms %children 99.91% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 55.914ms self 0.158ms children 55.755ms %children 99.72% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.388ms self 0.388ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.372ms self 1.372ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.954ms self 0.954ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 7.300ms self 7.300ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.878 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 4.052ms self 4.052ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.354ms self 0.354ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.302ms self 0.302ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.697ms self 0.697ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.936ms self 0.936ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 2.078ms self 2.078ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 9.396ms self 9.396ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 2.324ms self 2.324ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.818ms self 0.818ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.312ms self 0.312ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.683ms self 0.683ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.786ms self 1.786ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.963ms self 3.963ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 17.349ms self 17.349ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.331ms self 0.011ms children 0.320ms %children 96.65% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.102ms self 0.102ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 45.159ms self 0.012ms children 45.147ms %children 99.97% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 45.028ms self 0.040ms children 44.989ms %children 99.91% 2023-04-22 21:09:45.879 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, 
after InlineApplyIR/LoweringTransformation/Optimize total 44.989ms self 0.144ms children 44.845ms %children 99.68% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.204ms self 0.204ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.304ms self 0.304ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.693ms self 0.693ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.773ms self 0.773ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 22.262ms self 22.262ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.169ms self 0.169ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 2.321ms self 2.321ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.332ms self 0.332ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.323ms self 0.323ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.681ms self 0.681ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.749ms self 0.749ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 7.502ms self 7.502ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.163ms self 0.163ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 2.263ms self 2.263ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.307ms self 0.307ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.307ms self 0.307ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.683ms self 0.683ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.737ms self 0.737ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.714ms self 1.714ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 2.197ms self 2.197ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/Verify total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.880 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 41.658ms self 0.014ms children 41.644ms %children 99.97% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 41.410ms self 41.410ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.140ms self 0.140ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 50.576ms self 0.012ms children 50.564ms %children 99.98% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 50.424ms self 0.038ms children 50.387ms %children 99.93% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 50.387ms self 0.143ms children 50.243ms %children 99.72% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.229ms self 0.229ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.359ms self 0.359ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.852ms self 0.852ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.849ms self 0.849ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 17.691ms self 17.691ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.165ms self 0.165ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.603ms self 2.603ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.426ms self 0.426ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.358ms self 0.358ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.789ms self 0.789ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.848ms self 0.848ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.562ms self 4.562ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.577ms self 2.577ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.426ms self 0.426ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.881 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.348ms self 0.348ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.785ms self 0.785ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.868ms self 0.868ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 2.491ms self 2.491ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 12.697ms self 12.697ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.107ms self 0.107ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 14.311ms self 14.311ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 38.318ms self 38.318ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledFunction total 47.348s self 47.348s children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 
21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.431ms self 0.010ms children 0.421ms %children 97.68% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.412ms self 0.016ms children 0.396ms %children 96.05% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.396ms self 0.028ms children 0.368ms %children 93.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.882 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.030ms self 0.008ms children 0.021ms %children 71.73% 2023-04-22 21:09:45.883 : INFO: timing 
SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.258ms self 0.007ms children 0.251ms %children 97.11% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.247ms self 0.013ms children 0.234ms %children 94.63% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.234ms self 0.067ms children 0.166ms %children 71.15% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 0.054ms self 0.007ms children 0.047ms %children 86.24% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.883 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 0.038ms self 0.038ms children 
0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.161ms self 0.006ms children 0.155ms %children 96.18% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.152ms self 0.010ms children 0.142ms %children 93.61% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.142ms self 0.019ms children 0.124ms %children 86.95% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile total 132.006ms self 128.546ms children 3.460ms %children 2.62% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.904ms self 0.009ms children 0.895ms %children 99.01% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.889ms self 0.013ms children 0.875ms 
%children 98.49% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.875ms self 0.043ms children 0.832ms %children 95.13% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.606ms self 0.606ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.884 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.885 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR 
total 0.027ms self 0.007ms children 0.020ms %children 72.54% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.165ms self 0.006ms children 0.159ms %children 96.50% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.155ms self 0.010ms children 0.146ms %children 93.86% 2023-04-22 21:09:45.894 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.146ms self 0.019ms children 0.127ms %children 87.02% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.193ms self 0.008ms children 0.184ms %children 95.71% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation 
total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.199ms self 0.007ms children 0.193ms %children 96.62% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.189ms self 0.009ms children 0.179ms %children 94.98% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.179ms self 0.022ms children 0.157ms %children 87.60% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.384ms self 0.019ms children 0.364ms %children 94.96% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation 
total 0.351ms self 0.016ms children 0.335ms %children 95.34% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.335ms self 0.022ms children 0.313ms %children 93.45% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.895 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing 
SparkBackend.executeEncode/Compile/InlineApplyIR total 0.020ms self 0.003ms children 0.017ms %children 83.42% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.146ms self 0.003ms children 0.143ms %children 97.97% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.140ms self 0.007ms children 0.133ms %children 95.28% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.133ms self 0.012ms children 0.122ms %children 91.22% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.114ms self 0.003ms children 0.111ms %children 97.03% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing 
SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.153ms self 0.003ms children 0.150ms %children 98.02% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.146ms self 0.006ms children 0.140ms %children 95.59% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.140ms self 0.012ms children 0.128ms %children 91.45% 2023-04-22 21:09:45.896 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.109ms self 0.109ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.371ms self 0.004ms children 0.366ms %children 98.79% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.359ms self 0.009ms children 0.351ms %children 97.61% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.351ms self 0.023ms children 0.328ms %children 93.49% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 
0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.018ms self 0.003ms children 0.015ms %children 81.74% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.138ms self 0.003ms children 0.135ms %children 97.84% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.897 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.132ms self 0.006ms children 0.126ms %children 95.11% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.126ms self 0.012ms children 0.114ms %children 90.72% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.103ms self 0.003ms children 0.100ms %children 96.81% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 
0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.090ms self 0.090ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.154ms self 0.003ms children 0.151ms %children 97.78% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.148ms self 0.007ms children 0.141ms %children 95.57% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.141ms self 0.012ms children 0.129ms %children 91.58% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.089ms self 0.089ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 3.488ms self 3.488ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.898 : INFO: timing SparkBackend.executeEncode/RunCompiledFunction total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:45.918 Hail: INFO: 
hwe_normalize: found 114591 variants after filtering out monomorphic sites.
2023-04-22 21:09:46.085 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:09:46.089 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:09:46.089 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:09:46.089 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:09:46.089 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5
2023-04-22 21:09:46.089 : INFO: timing SparkBackend.parse_value_ir total 3.594ms self 3.594ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.089 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:09:46.090 : INFO: starting execution of query hail_query_2 of initial size 5
2023-04-22 21:09:46.090 : INFO: initial IR: IR size 5: (Let __rng_state (RNGStateLiteral) (MakeTuple (0) (MakeStruct (__gt (NA Int32)))))
2023-04-22 21:09:46.091 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after LowerMatrixToTable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after EvalRelationalLets: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after LowerAndExecuteShuffles: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.092 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.093 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] )
2023-04-22 21:09:46.093 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Struct{__gt:Int32}] ))
2023-04-22 21:09:46.093 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] )
2023-04-22 21:09:46.094 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] )
2023-04-22 21:09:46.094 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] )
2023-04-22 21:09:46.094 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] )
2023-04-22 21:09:46.094 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] )
2023-04-22 21:09:46.104 : INFO: encoder cache miss (4 hits, 4 misses, 0.500)
2023-04-22 21:09:46.107 : INFO: instruction count: 3: __C470HailClassLoaderContainer.
2023-04-22 21:09:46.107 : INFO: instruction count: 3: __C470HailClassLoaderContainer.
2023-04-22 21:09:46.108 : INFO: instruction count: 3: __C472FSContainer.
2023-04-22 21:08:46.108 : INFO: instruction count: 3: __C472FSContainer.
2023-04-22 21:09:46.109 : INFO: instruction count: 3: __C474etypeEncode.
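The hwe_normalize line above is the first user-facing result in this stretch of the log: Hail's HWE-normalization helper (used internally by hl.hwe_normalized_pca and hl.pc_relate) reports that 114,591 variants remain after monomorphic sites are dropped. The Python call that triggered it is not recorded in the log; the following is only a minimal sketch of the kind of pipeline that produces this message, assuming hl.hwe_normalized_pca and a hypothetical input path and k:

    import hail as hl

    hl.init()  # local[*] Spark backend, as configured earlier in this log

    # Hypothetical dataset; the actual input behind this log is not recorded here.
    mt = hl.read_matrix_table("data/example.mt")

    # hwe_normalized_pca HWE-normalizes the genotype calls and first drops
    # monomorphic sites, which is what emits the
    # "hwe_normalize: found N variants after filtering out monomorphic sites" line.
    eigenvalues, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)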
2023-04-22 21:09:46.109 : INFO: instruction count: 7: __C474etypeEncode.apply 2023-04-22 21:09:46.109 : INFO: instruction count: 9: __C474etypeEncode.__m476ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDENDEND 2023-04-22 21:09:46.109 : INFO: instruction count: 9: __C474etypeEncode.__m477ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDEND 2023-04-22 21:09:46.109 : INFO: instruction count: 9: __C474etypeEncode.__m478ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_o_int32ENDEND 2023-04-22 21:09:46.109 : INFO: instruction count: 36: __C474etypeEncode.__m479ENCODE_SBaseStructPointer_TO_r_struct_of_o_int32END 2023-04-22 21:09:46.109 : INFO: instruction count: 4: __C474etypeEncode.__m480ENCODE_SInt32$_TO_o_int32 2023-04-22 21:09:46.112 MemoryStore: INFO: Block broadcast_10 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:46.141 MemoryStore: INFO: Block broadcast_10_piece0 stored as bytes in memory (estimated size 58.0 B, free 28.8 GiB) 2023-04-22 21:09:46.144 BlockManagerInfo: INFO: Added broadcast_10_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 58.0 B, free: 28.8 GiB) 2023-04-22 21:09:46.144 SparkContext: INFO: Created broadcast 10 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:46.145 : INFO: instruction count: 3: __C455HailClassLoaderContainer. 2023-04-22 21:09:46.145 : INFO: instruction count: 3: __C455HailClassLoaderContainer. 2023-04-22 21:09:46.145 : INFO: instruction count: 3: __C457FSContainer. 2023-04-22 21:09:46.145 : INFO: instruction count: 3: __C457FSContainer. 2023-04-22 21:09:46.147 : INFO: instruction count: 3: __C459Compiled. 2023-04-22 21:09:46.147 : INFO: instruction count: 7: __C459Compiled.apply 2023-04-22 21:09:46.147 : INFO: instruction count: 9: __C459Compiled.setPartitionIndex 2023-04-22 21:09:46.147 : INFO: instruction count: 4: __C459Compiled.addPartitionRegion 2023-04-22 21:09:46.147 : INFO: instruction count: 4: __C459Compiled.setPool 2023-04-22 21:09:46.147 : INFO: instruction count: 3: __C459Compiled.addHailClassLoader 2023-04-22 21:09:46.147 : INFO: instruction count: 3: __C459Compiled.addFS 2023-04-22 21:09:46.147 : INFO: instruction count: 4: __C459Compiled.addTaskContext 2023-04-22 21:09:46.147 : INFO: instruction count: 41: __C459Compiled.addAndDecodeLiterals 2023-04-22 21:09:46.147 : INFO: instruction count: 27: __C459Compiled.__m465DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:09:46.148 : INFO: instruction count: 17: __C459Compiled.__m466INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDEND_TO_r_tuple_of_r_tuple_of_r_struct_of_o_int32ENDENDEND 2023-04-22 21:09:46.148 : INFO: instruction count: 17: __C459Compiled.__m467INPLACE_DECODE_r_struct_of_r_struct_of_o_int32ENDEND_TO_r_tuple_of_r_struct_of_o_int32ENDEND 2023-04-22 21:09:46.148 : INFO: instruction count: 48: __C459Compiled.__m468INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:46.148 : INFO: instruction count: 10: __C459Compiled.__m469INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:46.148 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Struct{__gt:Int32}] )) 2023-04-22 21:09:46.149 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.149 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.149 : INFO: after optimize: 
compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.149 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.149 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.170 : INFO: encoder cache hit 2023-04-22 21:09:46.171 MemoryStore: INFO: Block broadcast_11 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:46.173 MemoryStore: INFO: Block broadcast_11_piece0 stored as bytes in memory (estimated size 58.0 B, free 28.8 GiB) 2023-04-22 21:09:46.179 BlockManagerInfo: INFO: Added broadcast_11_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 58.0 B, free: 28.8 GiB) 2023-04-22 21:09:46.180 SparkContext: INFO: Created broadcast 11 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:46.180 : INFO: instruction count: 3: __C481HailClassLoaderContainer. 2023-04-22 21:09:46.180 : INFO: instruction count: 3: __C481HailClassLoaderContainer. 2023-04-22 21:09:46.181 : INFO: instruction count: 3: __C483FSContainer. 2023-04-22 21:09:46.181 : INFO: instruction count: 3: __C483FSContainer. 2023-04-22 21:09:46.182 : INFO: instruction count: 3: __C485Compiled. 2023-04-22 21:09:46.182 : INFO: instruction count: 7: __C485Compiled.apply 2023-04-22 21:09:46.182 : INFO: instruction count: 9: __C485Compiled.setPartitionIndex 2023-04-22 21:09:46.183 : INFO: instruction count: 4: __C485Compiled.addPartitionRegion 2023-04-22 21:09:46.183 : INFO: instruction count: 4: __C485Compiled.setPool 2023-04-22 21:09:46.183 : INFO: instruction count: 3: __C485Compiled.addHailClassLoader 2023-04-22 21:09:46.183 : INFO: instruction count: 3: __C485Compiled.addFS 2023-04-22 21:09:46.183 : INFO: instruction count: 4: __C485Compiled.addTaskContext 2023-04-22 21:09:46.183 : INFO: instruction count: 41: __C485Compiled.addAndDecodeLiterals 2023-04-22 21:09:46.183 : INFO: instruction count: 27: __C485Compiled.__m491DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:09:46.183 : INFO: instruction count: 17: __C485Compiled.__m492INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDEND_TO_r_tuple_of_r_tuple_of_r_struct_of_o_int32ENDENDEND 2023-04-22 21:09:46.183 : INFO: instruction count: 17: __C485Compiled.__m493INPLACE_DECODE_r_struct_of_r_struct_of_o_int32ENDEND_TO_r_tuple_of_r_struct_of_o_int32ENDEND 2023-04-22 21:09:46.183 : INFO: instruction count: 48: __C485Compiled.__m494INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:46.183 : INFO: instruction count: 10: __C485Compiled.__m495INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:46.184 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Struct{__gt:Int32}] )) 2023-04-22 21:09:46.184 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.184 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.184 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.185 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.185 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Struct{__gt:Int32}]] ) 2023-04-22 21:09:46.216 
: INFO: encoder cache hit 2023-04-22 21:09:46.217 MemoryStore: INFO: Block broadcast_12 stored as values in memory (estimated size 104.0 B, free 28.8 GiB) 2023-04-22 21:09:46.219 MemoryStore: INFO: Block broadcast_12_piece0 stored as bytes in memory (estimated size 58.0 B, free 28.8 GiB) 2023-04-22 21:09:46.222 BlockManagerInfo: INFO: Added broadcast_12_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 58.0 B, free: 28.8 GiB) 2023-04-22 21:09:46.223 SparkContext: INFO: Created broadcast 12 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:46.223 : INFO: instruction count: 3: __C496HailClassLoaderContainer. 2023-04-22 21:09:46.223 : INFO: instruction count: 3: __C496HailClassLoaderContainer. 2023-04-22 21:09:46.224 : INFO: instruction count: 3: __C498FSContainer. 2023-04-22 21:09:46.224 : INFO: instruction count: 3: __C498FSContainer. 2023-04-22 21:09:46.225 : INFO: instruction count: 3: __C500Compiled. 2023-04-22 21:09:46.225 : INFO: instruction count: 7: __C500Compiled.apply 2023-04-22 21:09:46.225 : INFO: instruction count: 9: __C500Compiled.setPartitionIndex 2023-04-22 21:09:46.225 : INFO: instruction count: 4: __C500Compiled.addPartitionRegion 2023-04-22 21:09:46.225 : INFO: instruction count: 4: __C500Compiled.setPool 2023-04-22 21:09:46.225 : INFO: instruction count: 3: __C500Compiled.addHailClassLoader 2023-04-22 21:09:46.225 : INFO: instruction count: 3: __C500Compiled.addFS 2023-04-22 21:09:46.226 : INFO: instruction count: 4: __C500Compiled.addTaskContext 2023-04-22 21:09:46.226 : INFO: instruction count: 41: __C500Compiled.addAndDecodeLiterals 2023-04-22 21:09:46.226 : INFO: instruction count: 27: __C500Compiled.__m506DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:09:46.226 : INFO: instruction count: 17: __C500Compiled.__m507INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_o_int32ENDENDEND_TO_r_tuple_of_r_tuple_of_r_struct_of_o_int32ENDENDEND 2023-04-22 21:09:46.226 : INFO: instruction count: 17: __C500Compiled.__m508INPLACE_DECODE_r_struct_of_r_struct_of_o_int32ENDEND_TO_r_tuple_of_r_struct_of_o_int32ENDEND 2023-04-22 21:09:46.226 : INFO: instruction count: 48: __C500Compiled.__m509INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:46.226 : INFO: instruction count: 10: __C500Compiled.__m510INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:46.229 : INFO: encoder cache miss (6 hits, 5 misses, 0.545) 2023-04-22 21:09:46.231 : INFO: instruction count: 3: __C511HailClassLoaderContainer. 2023-04-22 21:09:46.231 : INFO: instruction count: 3: __C511HailClassLoaderContainer. 2023-04-22 21:09:46.233 : INFO: instruction count: 3: __C513FSContainer. 2023-04-22 21:09:46.233 : INFO: instruction count: 3: __C513FSContainer. 2023-04-22 21:09:46.234 : INFO: instruction count: 3: __C515etypeEncode. 
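Each timing line in these traces reports total, self, and children times in milliseconds; throughout this log total = self + children, and %children is children divided by total. The encoder cache lines likewise report a running hit rate, hits / (hits + misses). A small check of both relationships, using figures copied from the executeEncode summary and encoder-cache entries in this log (plain arithmetic, nothing Hail-specific):

    # Figures copied from this log: executeEncode summary and encoder cache stats.
    total_ms, self_ms, children_ms = 146.152, 9.522, 136.630

    assert abs(total_ms - (self_ms + children_ms)) < 1e-9      # total = self + children
    print(f"%children = {children_ms / total_ms:.2%}")          # prints 93.48%, as logged

    hits, misses = 6, 5
    print(f"hit rate = {hits / (hits + misses):.3f}")            # prints 0.545, as logged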
2023-04-22 21:09:46.234 : INFO: instruction count: 7: __C515etypeEncode.apply 2023-04-22 21:09:46.234 : INFO: instruction count: 17: __C515etypeEncode.__m517ENCODE_SBaseStructPointer_TO_o_struct_of_o_struct_of_o_int32ENDEND 2023-04-22 21:09:46.234 : INFO: instruction count: 36: __C515etypeEncode.__m518ENCODE_SBaseStructPointer_TO_o_struct_of_o_int32END 2023-04-22 21:09:46.234 : INFO: instruction count: 4: __C515etypeEncode.__m519ENCODE_SInt32$_TO_o_int32 2023-04-22 21:09:46.235 : INFO: finished execution of query hail_query_2, result size is 2.00 B 2023-04-22 21:09:46.235 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:09:46.235 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:09:46.235 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:09:46.235 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:09:46.235 : INFO: timing SparkBackend.executeEncode total 146.152ms self 9.522ms children 136.630ms %children 93.48% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 0.842ms self 0.006ms children 0.836ms %children 99.27% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.822ms self 0.026ms children 0.796ms %children 96.88% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.796ms self 0.042ms children 0.755ms %children 94.75% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.210ms self 0.210ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial 
IR/LoweringTransformation/Optimize/FoldConstants total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 0.031ms self 0.004ms children 0.028ms %children 88.75% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 0.179ms self 0.003ms children 0.176ms %children 98.27% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.173ms self 0.009ms children 0.164ms %children 94.96% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.164ms self 0.012ms children 0.152ms %children 92.54% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 0.025ms self 0.003ms children 0.023ms %children 88.63% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.236 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 0.017ms self 0.003ms children 0.014ms %children 84.75% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.020ms self 0.003ms children 0.018ms %children 87.29% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.153ms self 0.003ms children 0.149ms %children 97.99% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 
21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.147ms self 0.007ms children 0.139ms %children 95.04% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.139ms self 0.012ms children 0.127ms %children 91.05% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 0.031ms self 0.003ms children 0.027ms %children 89.85% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.146ms self 0.003ms children 0.143ms %children 97.82% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.140ms self 0.009ms children 0.131ms %children 93.26% 2023-04-22 21:09:46.237 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.131ms self 0.012ms children 0.119ms %children 91.11% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.237 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile total 133.609ms self 130.833ms children 2.777ms %children 2.08% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.299ms self 0.004ms children 0.294ms %children 98.55% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.289ms self 0.008ms children 0.281ms %children 97.22% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.281ms self 0.022ms children 0.260ms %children 92.27% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 
0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.020ms self 0.003ms children 0.017ms %children 82.80% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.143ms self 0.003ms children 0.140ms %children 97.89% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after 
InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.137ms self 0.007ms children 0.130ms %children 94.87% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.130ms self 0.012ms children 0.118ms %children 90.64% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.238 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.195ms self 0.004ms children 0.191ms %children 98.13% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.144ms self 0.003ms children 0.141ms %children 97.68% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.138ms self 0.007ms children 0.131ms %children 95.09% 
2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.131ms self 0.013ms children 0.118ms %children 90.35% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.348ms self 0.004ms children 0.344ms %children 98.78% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.337ms self 0.008ms children 0.329ms %children 97.60% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.329ms self 0.036ms children 0.293ms %children 88.94% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/NormalizeNames total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.020ms self 0.003ms children 0.016ms %children 82.55% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.239 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.153ms self 0.003ms children 0.149ms %children 97.81% 2023-04-22 21:09:46.240 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.146ms self 0.007ms children 0.140ms %children 95.49% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.140ms self 0.012ms children 0.127ms %children 91.14% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.143ms self 0.004ms children 0.139ms %children 97.47% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.147ms self 0.003ms children 0.143ms %children 97.69% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation total 0.140ms self 0.007ms children 0.133ms %children 95.08% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.133ms self 0.013ms children 0.121ms %children 90.55% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.310ms self 0.004ms children 0.306ms %children 98.71% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.299ms self 0.023ms children 0.276ms %children 92.31% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.276ms self 0.021ms children 0.255ms %children 92.41% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 
21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.240 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.241 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.019ms self 0.004ms children 0.016ms %children 81.93% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.144ms 
self 0.003ms children 0.141ms %children 97.93% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.137ms self 0.007ms children 0.131ms %children 95.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.131ms self 0.013ms children 0.118ms %children 90.39% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.137ms self 0.003ms children 0.134ms %children 97.55% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.124ms self 0.124ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.264 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.146ms self 0.003ms children 0.142ms %children 97.74% 2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.265 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.139ms self 0.007ms children 0.132ms %children 95.06%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.132ms self 0.013ms children 0.119ms %children 90.28%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.117ms self 0.117ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.574ms self 1.574ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.265 : INFO: timing SparkBackend.executeEncode/RunCompiledFunction total 0.002ms self 0.002ms children 0.000ms %children 0.00%
2023-04-22 21:09:46.389 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:09:46.477 : INFO: JSON: JObject(List((name,JString(PCA)), (entryField,JString(__uid_3)), (k,JInt(10)), (computeLoadings,JBool(false))))
2023-04-22 21:09:46.495 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:09:46.495 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:09:46.495 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:09:46.495 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5
2023-04-22 21:09:46.495 : INFO: timing SparkBackend.parse_value_ir total
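The JSON record just above (`PCA`, `entryField=__uid_3`, `k=10`, `computeLoadings=false`) and the `MatrixPLINKReader` configuration in the IR dumps that follow describe an HWE-normalized PCA over a PLINK fileset. The user script is not part of this log, so the sketch below is only a plausible reconstruction: the bed/bim/fam paths and the GRCh38 reference are copied from the reader config (whose contig recoding matches `import_plink`'s GRCh38 default), while the variable names are assumptions, and the call could just as well have been made indirectly, e.g. via `hl.pc_relate(..., k=10)`, which computes the same PCA internally:

```python
import hail as hl

hl.init()  # local Spark backend, as in this log

# Import the PLINK fileset referenced by the MatrixPLINKReader config.
mt = hl.import_plink(
    bed='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed',
    bim='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim',
    fam='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam',
    reference_genome='GRCh38',
    a2_reference=True,
)

# k=10, compute_loadings=False match the PCA parameters in the JSON record above.
eigenvalues, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)
```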
106.254ms self 106.254ms children 0.000ms %children 0.00% 2023-04-22 21:09:46.496 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:09:46.496 : INFO: starting execution of query hail_query_3 of initial size 115 2023-04-22 21:09:46.563 : INFO: initial IR: IR size 115: (Let __rng_state (RNGStateLiteral) (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (MatrixToTableApply "{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false}" (MatrixMapGlobals (MatrixMapRows (MatrixMapCols None (MatrixMapEntries (MatrixMapEntries (MatrixMapRows (MatrixMapRows (MatrixFilterRows (MatrixMapRows (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (SelectFields () (SelectFields (GT) (Ref g))) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))) (AggLet __cse_1 False (GetField __gt (Ref g)) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref va)) None (__AC (ApplyAggOp Sum () ((ApplyIR 2 toInt64 () Int64 (Ref __cse_1))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __cse_1)))))))))) (Let __cse_2 (GetField __AC (Ref va)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __cse_2) (ApplyIR 4 toInt64 () Int64 (I32 0))) (ApplyComparisonOp LT (Ref __cse_2) (ApplyBinaryPrimOp Multiply (ApplyIR 5 toInt64 () Int64 (I32 2)) (GetField __n_called (Ref va))))) (False)))) (InsertFields (SelectFields (locus alleles rsid cm_position __AC __n_called) (Ref va)) None (__mean_gt (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref va)) (GetField __n_called (Ref va)))))) (Let __cse_3 (GetField __mean_gt (Ref va)) (InsertFields (SelectFields (locus alleles rsid cm_position __AC __n_called __mean_gt) (Ref va)) None (__hwe_scaled_std_dev (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __cse_3) (ApplyBinaryPrimOp Subtract (ApplyIR 7 toFloat64 () Float64 (I32 2)) (Ref 
__cse_3))) (ApplyIR 8 toFloat64 () Float64 (I32 114591))) (ApplyIR 9 toFloat64 () Float64 (I32 2)))))))) (If (ApplyUnaryPrimOp Bang (IsNA (SelectFields (__gt) (Ref g)))) (SelectFields (__gt) (Ref g)) (Literal Struct{__gt:Int32} ))) (InsertFields (SelectFields () (SelectFields (__gt) (Ref g))) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (ApplyIR 11 toFloat64 () Float64 (GetField __gt (Ref g))) (GetField __mean_gt (Ref va))) (GetField __hwe_scaled_std_dev (Ref va))) (F64 0.0))))) (InsertFields (SelectFields (s) (SelectFields (s fam_id pat_id mat_id is_female is_case) (Ref sa))) None)) (InsertFields (SelectFields (locus alleles) (SelectFields (locus alleles rsid cm_position __AC __n_called __mean_gt __hwe_scaled_std_dev) (Ref va))) None)) (InsertFields (SelectFields () (SelectFields () (Ref global))) None))))) 2023-04-22 21:09:46.712 : INFO: after optimize: relationalLowerer, initial IR: IR size 84: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (MatrixToTableApply "{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false}" (MatrixMapRows (MatrixMapEntries (MatrixMapRows (MatrixFilterRows (MatrixMapRows (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String]},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))) (AggLet __iruid_595 False (GetField __gt (Ref g)) (InsertFields (Ref va) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_595))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_595)))))))))) (Let __iruid_596 (GetField __AC (Ref va)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_596) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_596) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref va))))) (False)))) (Let __iruid_597 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref va)) (GetField __n_called (Ref va))) (InsertFields (Ref va) ("locus" "alleles" "__AC" "__n_called" "__mean_gt" "__hwe_scaled_std_dev") (__mean_gt (Ref __iruid_597)) 
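The IR above also spells out the normalization that fills the `__uid_3` entry field: per variant, `__mean_gt = __AC / __n_called`, `__hwe_scaled_std_dev = sqrt(__mean_gt * (2 - __mean_gt) * 114591 / 2)` (114591 is a constant baked into the plan; in `hwe_normalized_pca` it is the number of variants remaining after the `0 < AC < 2 * n_called` row filter), and each genotype is rescaled to `(__gt - __mean_gt) / __hwe_scaled_std_dev`, with missing entries coalesced to 0.0. A plain-Python restatement of that per-variant arithmetic (function and variable names are mine; the real computation runs inside the compiled IR):

```python
import math

N_VARIANTS = 114_591  # the constant baked into the IR above

def hwe_normalize(gts, n_variants=N_VARIANTS):
    """Redo, in plain Python, the per-variant arithmetic the IR performs.

    `gts` is one variant's genotype dosages (0, 1, 2, or None for missing).
    Returns the normalized entries, or None if the row filter drops the variant.
    """
    called = [g for g in gts if g is not None]
    ac, n_called = sum(called), len(called)
    if not (0 < ac < 2 * n_called):       # the TableFilter condition
        return None
    mean_gt = ac / n_called               # __mean_gt
    hwe_sd = math.sqrt(mean_gt * (2 - mean_gt) * n_variants / 2)  # __hwe_scaled_std_dev
    # Missing genotypes coalesce to 0.0 after centering and scaling (__uid_3).
    return [0.0 if g is None else (g - mean_gt) / hwe_sd for g in gts]

print(hwe_normalize([0, 1, 2, None]))
```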
(__hwe_scaled_std_dev (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_597) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_597))) (F64 114591.0)) (F64 2.0))))))) (Let __iruid_598 (If (IsNA (Ref g)) (Literal Struct{__gt:Int32} ) (Ref g)) (InsertFields (SelectFields () (Ref __iruid_598)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_598))) (GetField __mean_gt (Ref va))) (GetField __hwe_scaled_std_dev (Ref va))) (F64 0.0)))))) (SelectFields (locus alleles) (Ref va))))) 2023-04-22 21:09:46.746 : INFO: after LowerMatrixToTable: IR size 168: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableMapRows (TableMapRows (TableFilter (TableMapRows (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (Ref row) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamZip -1 AssumeSameLength (g sa) (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (ToStream False (GetField __cols (Ref global))) (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref g)))))))))) (Let n_cols (ArrayLen (GetField __cols (Ref global))) (InsertFields (Let __iruid_599 (MakeStruct) (StreamAgg i (ToStream False (ToArray (StreamFilter i (ToStream False (ToArray (StreamRange -1 False (I32 0) (ArrayLen (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (I32 1)))) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i))))))) (AggLet sa False (ArrayRef -1 (GetField __cols (Ref global)) (Ref i)) (AggLet g False (ArrayRef -1 (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i)) (AggLet __iruid_595 False (GetField __gt (Ref g)) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_595))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_595))))))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)))))) (Let __iruid_596 (GetField __AC (SelectFields (locus alleles __AC __n_called) (Ref row))) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_596) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_596) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (SelectFields (locus alleles __AC __n_called) (Ref row)))))) (False)))) (Let n_cols (ArrayLen (GetField __cols (Ref global))) (InsertFields (Let __iruid_600 (MakeStruct) (Let __iruid_597 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (SelectFields (locus alleles __AC __n_called) (Ref row))) (GetField __n_called (SelectFields (locus alleles __AC __n_called) (Ref row)))) (InsertFields (SelectFields (locus alleles __AC __n_called) (Ref row)) ("locus" "alleles" "__AC" "__n_called" "__mean_gt" "__hwe_scaled_std_dev") (__mean_gt (Ref __iruid_597)) (__hwe_scaled_std_dev (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_597) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_597))) (F64 114591.0)) (F64 2.0))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)))))) (InsertFields (Ref row) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamZip -1 AssumeSameLength (g sa) (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (ToStream False (GetField __cols (Ref global))) (Let __iruid_598 (If (IsNA (Ref g)) (Literal Struct{__gt:Int32} ) (Ref g)) (InsertFields (SelectFields () (Ref __iruid_598)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_598))) (GetField __mean_gt (SelectFields (locus alleles __AC __n_called __mean_gt __hwe_scaled_std_dev) (Ref row)))) (GetField __hwe_scaled_std_dev (SelectFields (locus alleles __AC __n_called __mean_gt __hwe_scaled_std_dev) (Ref row)))) (F64 0.0)))))))))) (Let n_cols (ArrayLen (GetField __cols (Ref global))) (InsertFields (Let __iruid_601 (MakeStruct) (SelectFields (locus alleles) (SelectFields (locus alleles __AC __n_called __mean_gt __hwe_scaled_std_dev) (Ref row)))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)))))))) 2023-04-22 21:09:46.894 BlockManagerInfo: INFO: Removed broadcast_7_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 65.0 B, free: 28.8 GiB) 2023-04-22 21:09:46.939 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 106: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableFilter (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_666 (ToArray (StreamMap __iruid_667 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_667)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_667))))))) (InsertFields (StreamAgg __iruid_668 (StreamFilter __iruid_669 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_666)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_669))))) (AggLet __iruid_670 False (GetField __gt (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_668))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_670))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_670)))))))))) None (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_666))))) (Let __iruid_671 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_671) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_671) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))) (Let __iruid_672 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref row)) (GetField __n_called (Ref row))) (Let __iruid_673 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_672) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_672))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_674 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_675 (If (IsNA (Ref __iruid_674)) (Literal Struct{__gt:Int32} ) (Ref __iruid_674)) (InsertFields (SelectFields () (Ref __iruid_675)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_675))) (Ref __iruid_672)) (Ref __iruid_673)) (F64 0.0)))))))))))))) 2023-04-22 21:09:46.956 : INFO: after LiftRelationalValuesToRelationalLets: IR size 106: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableFilter (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_666 (ToArray (StreamMap __iruid_667 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_667)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_667))))))) (InsertFields (StreamAgg __iruid_668 (StreamFilter __iruid_669 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_666)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_669))))) (AggLet __iruid_670 False (GetField __gt (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_668))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_670))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_670)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_666))))) (Let __iruid_671 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_671) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_671) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))) (Let __iruid_672 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref row)) (GetField __n_called (Ref row))) (Let __iruid_673 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_672) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_672))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_674 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_675 (If (IsNA (Ref __iruid_674)) (Literal Struct{__gt:Int32} ) (Ref __iruid_674)) (InsertFields (SelectFields () (Ref __iruid_675)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_675))) (Ref __iruid_672)) (Ref __iruid_673)) (F64 0.0)))))))))))))) 2023-04-22 21:09:46.960 : INFO: after EvalRelationalLets: IR size 106: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableFilter (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_666 (ToArray (StreamMap __iruid_667 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_667)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_667))))))) (InsertFields (StreamAgg __iruid_668 (StreamFilter __iruid_669 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_666)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_669))))) (AggLet __iruid_670 False (GetField __gt (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_668))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_670))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_670)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_666))))) (Let __iruid_671 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_671) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_671) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))) (Let __iruid_672 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref row)) (GetField __n_called (Ref row))) (Let __iruid_673 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_672) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_672))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_674 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_675 (If (IsNA (Ref __iruid_674)) (Literal Struct{__gt:Int32} ) (Ref __iruid_674)) (InsertFields (SelectFields () (Ref __iruid_675)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_675))) (Ref __iruid_672)) (Ref __iruid_673)) (F64 0.0)))))))))))))) 2023-04-22 21:09:46.984 : INFO: after LowerAndExecuteShuffles: IR size 106: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableFilter (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_666 (ToArray (StreamMap __iruid_667 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_667)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_667))))))) (InsertFields (StreamAgg __iruid_668 (StreamFilter __iruid_669 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_666)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_669))))) (AggLet __iruid_670 False (GetField __gt (ArrayRef -1 (Ref __iruid_666) (Ref __iruid_668))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_670))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_670)))))))))) None (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_666))))) (Let __iruid_671 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_671) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_671) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))) (Let __iruid_672 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref row)) (GetField __n_called (Ref row))) (Let __iruid_673 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_672) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_672))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_674 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_675 (If (IsNA (Ref __iruid_674)) (Literal Struct{__gt:Int32} ) (Ref __iruid_674)) (InsertFields (SelectFields () (Ref __iruid_675)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_675))) (Ref __iruid_672)) (Ref __iruid_673)) (F64 0.0)))))))))))))) 2023-04-22 21:09:47.026 BlockManagerInfo: INFO: Removed broadcast_10_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 58.0 B, free: 28.8 GiB) 2023-04-22 21:09:47.071 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 106: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (TableToTableApply "{\"name\":\"WrappedMatrixToTableFunction\",\"function\":{\"name\":\"PCA\",\"entryField\":\"__uid_3\",\"k\":10,\"computeLoadings\":false},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableFilter (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_696 (ToArray (StreamMap __iruid_697 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (InsertFields (SelectFields () (Ref __iruid_697)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_697))))))) (InsertFields (StreamAgg __iruid_698 (StreamFilter __iruid_699 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_696)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_699))))) (AggLet __iruid_700 False (GetField __gt (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_698))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_700))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_700)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_696))))) (Let __iruid_701 (GetField __AC (Ref row)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_701) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_701) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (Ref row))))) (False)))) (Let __iruid_702 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (Ref row)) (GetField __n_called (Ref row))) (Let __iruid_703 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_702) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_702))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (Ref row)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_704 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_705 (If (IsNA (Ref __iruid_704)) (Literal Struct{__gt:Int32} ) (Ref __iruid_704)) (InsertFields (SelectFields () (Ref __iruid_705)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_705))) (Ref __iruid_702)) (Ref __iruid_703)) (F64 0.0)))))))))))))) 2023-04-22 21:09:47.073 : INFO: LowerOrInterpretNonCompilable: cannot efficiently lower query: TableToTableApply 2023-04-22 21:09:47.073 : INFO: interpreting non-compilable result: TableWrite 2023-04-22 21:09:47.106 MemoryStore: INFO: Block broadcast_13 stored as values in memory (estimated size 47.3 MiB, free 28.7 GiB) 2023-04-22 21:09:47.153 BlockManagerInfo: INFO: Removed broadcast_12_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 58.0 B, free: 28.8 GiB) 2023-04-22 21:09:47.271 BlockManagerInfo: INFO: Removed broadcast_9_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 65.0 B, free: 28.8 GiB) 2023-04-22 21:09:47.859 MemoryStore: INFO: Block broadcast_13_piece0 stored as bytes in memory (estimated size 2.3 MiB, free 28.7 GiB) 2023-04-22 21:09:47.860 BlockManagerInfo: INFO: Added broadcast_13_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 2.3 MiB, free: 28.8 GiB) 2023-04-22 21:09:47.862 SparkContext: INFO: Created broadcast 13 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:47.963 MemoryStore: INFO: Block broadcast_14 stored as values in memory (estimated size 429.5 KiB, free 28.7 GiB) 2023-04-22 21:09:48.004 MemoryStore: INFO: Block broadcast_14_piece0 stored as bytes in memory (estimated size 32.5 KiB, free 28.7 GiB) 2023-04-22 21:09:48.004 BlockManagerInfo: INFO: Added broadcast_14_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 32.5 KiB, free: 28.8 GiB) 2023-04-22 21:09:48.005 SparkContext: INFO: Created broadcast 14 from broadcast at SparkBackend.scala:354 2023-04-22 
21:09:48.014 : INFO: initial IR: IR size 45: (Coalesce (Let __iruid_696 (ToArray (StreamMap __iruid_697 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_697)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_697))))))) (InsertFields (StreamAgg __iruid_698 (StreamFilter __iruid_699 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_696)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_699))))) (AggLet __iruid_700 False (GetField __gt (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_698))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_700))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_700)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_696)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.073 : INFO: after optimize: compileLowerer, initial IR: IR size 45: (Coalesce (Let __iruid_717 (ToArray (StreamMap __iruid_718 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_718)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_718))))))) (InsertFields (StreamAgg __iruid_719 (StreamFilter __iruid_720 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_717)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_717) (Ref __iruid_720))))) (AggLet __iruid_721 False (GetField __gt (ArrayRef -1 (Ref __iruid_717) (Ref __iruid_719))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_721))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_721)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_717)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.075 : INFO: after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_717 (ToArray (StreamMap __iruid_718 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_718)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_718))))))) (InsertFields (StreamAgg __iruid_719 (StreamFilter __iruid_720 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_717)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_717) (Ref __iruid_720))))) (AggLet __iruid_721 False (GetField __gt (ArrayRef -1 (Ref __iruid_717) (Ref __iruid_719))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_721))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_721)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_717)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.106 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_732 (ToArray (StreamMap __iruid_733 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_733)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_733))))))) (InsertFields (StreamAgg __iruid_734 (StreamFilter __iruid_735 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_732)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_732) (Ref __iruid_735))))) (AggLet __iruid_736 False (GetField __gt (ArrayRef -1 (Ref __iruid_732) (Ref __iruid_734))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_736))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_736)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_732)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.114 : INFO: after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_732 (ToArray (StreamMap __iruid_733 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_733)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_733))))))) (InsertFields (Let __iruid_737 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_734 (StreamFilter __iruid_735 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_732)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_732) (Ref __iruid_735))))) (Let __iruid_736 (GetField __gt (ArrayRef -1 (Ref __iruid_732) (Ref __iruid_734))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_736)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_736)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_737))) (__n_called (GetTupleElement 1 (Ref __iruid_737))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_732)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.158 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_750 (ToArray (StreamMap __iruid_751 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_751)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_751))))))) (InsertFields (Let __iruid_752 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_753 (StreamFilter __iruid_754 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_750)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_750) (Ref __iruid_754))))) (Let __iruid_755 (GetField __gt (ArrayRef -1 (Ref __iruid_750) (Ref __iruid_753))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_755)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_755)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_752))) (__n_called (GetTupleElement 1 (Ref __iruid_752))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_750)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.243 : INFO: encoder cache miss (6 hits, 6 misses, 0.500) 2023-04-22 21:09:48.244 : INFO: instruction count: 3: __C578HailClassLoaderContainer. 2023-04-22 21:09:48.244 : INFO: instruction count: 3: __C578HailClassLoaderContainer. 2023-04-22 21:09:48.244 : INFO: instruction count: 3: __C580FSContainer. 2023-04-22 21:09:48.244 : INFO: instruction count: 3: __C580FSContainer. 2023-04-22 21:09:48.245 : INFO: instruction count: 3: __C582etypeEncode. 2023-04-22 21:09:48.245 : INFO: instruction count: 7: __C582etypeEncode.apply 2023-04-22 21:09:48.245 : INFO: instruction count: 13: __C582etypeEncode.__m584ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:09:48.245 : INFO: instruction count: 16: __C582etypeEncode.__m585ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:09:48.247 MemoryStore: INFO: Block broadcast_15 stored as values in memory (estimated size 152.0 B, free 28.7 GiB) 2023-04-22 21:09:48.271 MemoryStore: INFO: Block broadcast_15_piece0 stored as bytes in memory (estimated size 113.0 B, free 28.7 GiB) 2023-04-22 21:09:48.273 BlockManagerInfo: INFO: Added broadcast_15_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 113.0 B, free: 28.8 GiB) 2023-04-22 21:09:48.275 SparkContext: INFO: Created broadcast 15 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:48.275 : INFO: instruction count: 3: __C520HailClassLoaderContainer. 2023-04-22 21:09:48.275 : INFO: instruction count: 3: __C520HailClassLoaderContainer. 2023-04-22 21:09:48.275 : INFO: instruction count: 3: __C522FSContainer. 2023-04-22 21:09:48.275 : INFO: instruction count: 3: __C522FSContainer. 2023-04-22 21:09:48.281 : INFO: instruction count: 3: __C524Compiled. 
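The reader and PCA configuration embedded in the IR dumps above pin down most of the user-facing call: the bed/bim/fam paths, GRCh38 reference genome, a2Reference flag and contig recoding come straight from the MatrixPLINKReader JSON, and the TableToTableApply names a PCA over entry field __uid_3 with k=10 and no loadings, written under a PC_Relate output directory (the persist_table... path is consistent with an intermediate persist during that step). A plausible reconstruction of the driving Python pipeline, with everything not literally present in the log (variable names, the explicit pc_relate call and its MAF cutoff) treated as an assumption, is:

import hail as hl

hl.init()

# Paths, reference genome and contig recoding copied from the MatrixPLINKReader
# JSON in the log; everything else in this sketch is assumed.
contig_recoding = {str(i): f"chr{i}" for i in range(1, 23)}
contig_recoding.update({"23": "chrX", "25": "chrX", "24": "chrY", "26": "chrM"})

mt = hl.import_plink(
    bed="/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed",
    bim="/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim",
    fam="/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam",
    reference_genome="GRCh38",
    contig_recoding=contig_recoding,
    a2_reference=True,
)

# k=10, compute_loadings=False matches the PCA config in the TableToTableApply;
# running it ahead of pc_relate (and the 0.01 MAF cutoff) is an assumption.
_, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)
rel = hl.pc_relate(mt.GT, 0.01, scores_expr=scores[mt.col_key].scores)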
2023-04-22 21:09:48.281 : INFO: instruction count: 230: __C524Compiled.apply 2023-04-22 21:09:48.282 : INFO: instruction count: 205: __C524Compiled.__m530split_ToArray 2023-04-22 21:09:48.282 : INFO: instruction count: 8: __C524Compiled.__m538nNonRefAlleles 2023-04-22 21:09:48.282 : INFO: instruction count: 9: __C524Compiled.__m550begin_group_0 2023-04-22 21:09:48.282 : INFO: instruction count: 17: __C524Compiled.__m551begin_group_0 2023-04-22 21:09:48.282 : INFO: instruction count: 155: __C524Compiled.__m552split_StreamFor 2023-04-22 21:09:48.282 : INFO: instruction count: 35: __C524Compiled.__m560arrayref_bounds_check 2023-04-22 21:09:48.283 : INFO: instruction count: 73: __C524Compiled.__m564begin_group_0 2023-04-22 21:09:48.283 : INFO: instruction count: 5: __C524Compiled.__m567toInt64 2023-04-22 21:09:48.283 : INFO: instruction count: 9: __C524Compiled.setPartitionIndex 2023-04-22 21:09:48.283 : INFO: instruction count: 4: __C524Compiled.addPartitionRegion 2023-04-22 21:09:48.300 : INFO: instruction count: 4: __C524Compiled.setPool 2023-04-22 21:09:48.300 : INFO: instruction count: 3: __C524Compiled.addHailClassLoader 2023-04-22 21:09:48.300 : INFO: instruction count: 3: __C524Compiled.addFS 2023-04-22 21:09:48.300 : INFO: instruction count: 4: __C524Compiled.addTaskContext 2023-04-22 21:09:48.300 : INFO: instruction count: 45: __C524Compiled.addAndDecodeLiterals 2023-04-22 21:09:48.300 : INFO: instruction count: 27: __C524Compiled.__m576DECODE_r_struct_of_r_binaryEND_TO_SBaseStructPointer 2023-04-22 21:09:48.300 : INFO: instruction count: 31: __C524Compiled.__m577INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:48.302 : INFO: initial IR: IR size 45: (Coalesce (Let __iruid_696 (ToArray (StreamMap __iruid_697 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_697)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_697))))))) (InsertFields (StreamAgg __iruid_698 (StreamFilter __iruid_699 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_696)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_699))))) (AggLet __iruid_700 False (GetField __gt (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_698))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_700))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_700)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_696)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.330 : INFO: after optimize: compileLowerer, initial IR: IR size 45: (Coalesce (Let __iruid_766 (ToArray (StreamMap __iruid_767 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_767)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_767))))))) (InsertFields (StreamAgg __iruid_768 (StreamFilter __iruid_769 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_766)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_766) (Ref __iruid_769))))) (AggLet __iruid_770 False (GetField __gt (ArrayRef -1 (Ref __iruid_766) (Ref __iruid_768))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_770))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_770)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_766)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.332 : INFO: after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_766 (ToArray (StreamMap __iruid_767 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_767)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_767))))))) (InsertFields (StreamAgg __iruid_768 (StreamFilter __iruid_769 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_766)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_766) (Ref __iruid_769))))) (AggLet __iruid_770 False (GetField __gt (ArrayRef -1 (Ref __iruid_766) (Ref __iruid_768))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_770))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_770)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_766)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.363 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_781 (ToArray (StreamMap __iruid_782 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_782)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_782))))))) (InsertFields (StreamAgg __iruid_783 (StreamFilter __iruid_784 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_781)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_781) (Ref __iruid_784))))) (AggLet __iruid_785 False (GetField __gt (ArrayRef -1 (Ref __iruid_781) (Ref __iruid_783))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_785))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_785)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_781)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.367 : INFO: after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_781 (ToArray (StreamMap __iruid_782 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_782)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_782))))))) (InsertFields (Let __iruid_786 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_783 (StreamFilter __iruid_784 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_781)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_781) (Ref __iruid_784))))) (Let __iruid_785 (GetField __gt (ArrayRef -1 (Ref __iruid_781) (Ref __iruid_783))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_785)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_785)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_786))) (__n_called (GetTupleElement 1 (Ref __iruid_786))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_781)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.407 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_799 (ToArray (StreamMap __iruid_800 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_800)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_800))))))) (InsertFields (Let __iruid_801 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_802 (StreamFilter __iruid_803 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_799)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_799) (Ref __iruid_803))))) (Let __iruid_804 (GetField __gt (ArrayRef -1 (Ref __iruid_799) (Ref __iruid_802))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_804)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_804)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_801))) (__n_called (GetTupleElement 1 (Ref __iruid_801))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_799)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.467 : INFO: encoder cache hit 2023-04-22 21:09:48.472 MemoryStore: INFO: Block broadcast_16 stored as values in memory (estimated size 152.0 B, free 28.7 GiB) 2023-04-22 21:09:48.474 MemoryStore: INFO: Block broadcast_16_piece0 stored as bytes in memory (estimated size 113.0 B, free 28.7 GiB) 2023-04-22 21:09:48.478 BlockManagerInfo: INFO: Added broadcast_16_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 113.0 B, free: 28.8 GiB) 2023-04-22 21:09:48.478 SparkContext: INFO: Created broadcast 16 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:48.479 : INFO: instruction count: 3: __C586HailClassLoaderContainer. 2023-04-22 21:09:48.479 : INFO: instruction count: 3: __C586HailClassLoaderContainer. 2023-04-22 21:09:48.479 : INFO: instruction count: 3: __C588FSContainer. 2023-04-22 21:09:48.479 : INFO: instruction count: 3: __C588FSContainer. 2023-04-22 21:09:48.485 : INFO: instruction count: 3: __C590Compiled. 
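The IR size 45 / 59 fragments compiled here (__C524, __C590, and again below) are the inner TableMapRows of the relational IR: per variant, each GT is converted to a non-reference allele count (nNonRefAlleles) and aggregated into two row fields, __AC and __n_called. Continuing the reconstruction above (field names are taken from the IR; mt is assumed to be the imported PLINK MatrixTable), the user-level equivalent is roughly:

import hail as hl

# Per-variant aggregation matching the StreamAgg IR: __AC sums the
# non-reference allele counts over called genotypes, __n_called counts
# the non-missing genotypes.
mt = mt.annotate_rows(
    __AC=hl.agg.sum(hl.int64(mt.GT.n_alt_alleles())),
    __n_called=hl.agg.count_where(hl.is_defined(mt.GT)),
)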
2023-04-22 21:09:48.485 : INFO: instruction count: 230: __C590Compiled.apply 2023-04-22 21:09:48.486 : INFO: instruction count: 205: __C590Compiled.__m596split_ToArray 2023-04-22 21:09:48.486 : INFO: instruction count: 8: __C590Compiled.__m604nNonRefAlleles 2023-04-22 21:09:48.486 : INFO: instruction count: 9: __C590Compiled.__m616begin_group_0 2023-04-22 21:09:48.486 : INFO: instruction count: 17: __C590Compiled.__m617begin_group_0 2023-04-22 21:09:48.486 : INFO: instruction count: 155: __C590Compiled.__m618split_StreamFor 2023-04-22 21:09:48.486 : INFO: instruction count: 35: __C590Compiled.__m626arrayref_bounds_check 2023-04-22 21:09:48.487 : INFO: instruction count: 73: __C590Compiled.__m630begin_group_0 2023-04-22 21:09:48.487 : INFO: instruction count: 5: __C590Compiled.__m633toInt64 2023-04-22 21:09:48.510 : INFO: instruction count: 9: __C590Compiled.setPartitionIndex 2023-04-22 21:09:48.510 : INFO: instruction count: 4: __C590Compiled.addPartitionRegion 2023-04-22 21:09:48.510 : INFO: instruction count: 4: __C590Compiled.setPool 2023-04-22 21:09:48.510 : INFO: instruction count: 3: __C590Compiled.addHailClassLoader 2023-04-22 21:09:48.510 : INFO: instruction count: 3: __C590Compiled.addFS 2023-04-22 21:09:48.510 : INFO: instruction count: 4: __C590Compiled.addTaskContext 2023-04-22 21:09:48.510 : INFO: instruction count: 45: __C590Compiled.addAndDecodeLiterals 2023-04-22 21:09:48.510 : INFO: instruction count: 27: __C590Compiled.__m642DECODE_r_struct_of_r_binaryEND_TO_SBaseStructPointer 2023-04-22 21:09:48.511 : INFO: instruction count: 31: __C590Compiled.__m643INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:48.512 : INFO: initial IR: IR size 45: (Coalesce (Let __iruid_696 (ToArray (StreamMap __iruid_697 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_697)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_697))))))) (InsertFields (StreamAgg __iruid_698 (StreamFilter __iruid_699 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_696)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_699))))) (AggLet __iruid_700 False (GetField __gt (ArrayRef -1 (Ref __iruid_696) (Ref __iruid_698))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_700))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_700)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_696)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.539 : INFO: after optimize: compileLowerer, initial IR: IR size 45: (Coalesce (Let __iruid_815 (ToArray (StreamMap __iruid_816 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_816)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_816))))))) (InsertFields (StreamAgg __iruid_817 (StreamFilter __iruid_818 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_815)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_815) (Ref __iruid_818))))) (AggLet __iruid_819 False (GetField __gt (ArrayRef -1 (Ref __iruid_815) (Ref __iruid_817))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_819))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_819)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_815)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.540 : INFO: after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_815 (ToArray (StreamMap __iruid_816 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_816)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_816))))))) (InsertFields (StreamAgg __iruid_817 (StreamFilter __iruid_818 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_815)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_815) (Ref __iruid_818))))) (AggLet __iruid_819 False (GetField __gt (ArrayRef -1 (Ref __iruid_815) (Ref __iruid_817))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_819))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_819)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_815)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.583 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 45: (Coalesce (Let __iruid_830 (ToArray (StreamMap __iruid_831 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_831)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_831))))))) (InsertFields (StreamAgg __iruid_832 (StreamFilter __iruid_833 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_830)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_830) (Ref __iruid_833))))) (AggLet __iruid_834 False (GetField __gt (ArrayRef -1 (Ref __iruid_830) (Ref __iruid_832))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (ApplyAggOp Sum () ((Cast Int64 (Ref __iruid_834))))) (__n_called (ApplyAggOp Sum () ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_834)))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_830)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.588 : INFO: after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_830 (ToArray (StreamMap __iruid_831 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_831)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_831))))))) (InsertFields (Let __iruid_835 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_832 (StreamFilter __iruid_833 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_830)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_830) (Ref __iruid_833))))) (Let __iruid_834 (GetField __gt (ArrayRef -1 (Ref __iruid_830) (Ref __iruid_832))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_834)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_834)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_835))) (__n_called (GetTupleElement 1 (Ref __iruid_835))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_830)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.609 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 59: (Coalesce (Let __iruid_848 (ToArray (StreamMap __iruid_849 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1))) (InsertFields (SelectFields () (Ref __iruid_849)) None (__gt (Apply 1 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_849))))))) (InsertFields (Let __iruid_850 (RunAgg ((TypedStateSig +PInt64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_851 (StreamFilter __iruid_852 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_848)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_848) (Ref __iruid_852))))) (Let __iruid_853 (GetField __gt (ArrayRef -1 (Ref __iruid_848) (Ref __iruid_851))) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Cast Int64 (Ref __iruid_853)))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 3 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_853)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PInt64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{GT:PCCall}]})) 1)) None (__AC (GetTupleElement 0 (Ref __iruid_850))) (__n_called (GetTupleElement 1 (Ref __iruid_850))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_848)))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],__AC:Int64,__n_called:Int64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__gt:Int32}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:48.667 : INFO: encoder cache hit 2023-04-22 21:09:48.676 MemoryStore: INFO: Block broadcast_17 stored as values in memory (estimated size 152.0 B, free 28.7 GiB) 2023-04-22 21:09:48.678 MemoryStore: INFO: Block broadcast_17_piece0 stored as bytes in memory (estimated size 113.0 B, free 28.7 GiB) 2023-04-22 21:09:48.680 BlockManagerInfo: INFO: Added broadcast_17_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 113.0 B, free: 28.8 GiB) 2023-04-22 21:09:48.682 SparkContext: INFO: Created broadcast 17 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:48.682 : INFO: instruction count: 3: __C644HailClassLoaderContainer. 2023-04-22 21:09:48.682 : INFO: instruction count: 3: __C644HailClassLoaderContainer. 2023-04-22 21:09:48.683 : INFO: instruction count: 3: __C646FSContainer. 2023-04-22 21:09:48.683 : INFO: instruction count: 3: __C646FSContainer. 2023-04-22 21:09:48.689 : INFO: instruction count: 3: __C648Compiled. 
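LowerArrayAggsToRunAggs rewrites that StreamAgg into an explicit RunAgg with two Sum(+PInt64) states: InitOp zeroes each state, a StreamFor/SeqOp loop feeds it one non-missing entry at a time, and ResultOp/MakeTuple reads out the (__AC, __n_called) pair. A plain-Python illustration of that init/seq/result structure (not Hail code; the function and variable names are illustrative only):

from typing import List, Optional, Tuple

def run_agg_ac_n_called(gts: List[Optional[int]]) -> Tuple[int, int]:
    # InitOp 0 / InitOp 1: both Sum states start at zero.
    ac = 0
    n_called = 0
    # StreamFor over entry indices; the StreamFilter drops missing entries (IsNA).
    for gt in gts:
        if gt is None:
            continue
        ac += gt        # SeqOp 0: Sum(Cast Int64 __gt)
        n_called += 1   # SeqOp 1: Sum(toInt64(!IsNA(__gt)))
    # ResultOp 0/1 + MakeTuple(0 1): the per-variant (__AC, __n_called) result.
    return ac, n_called

print(run_agg_ac_n_called([0, 1, None, 2]))  # -> (3, 3)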
2023-04-22 21:09:48.689 : INFO: instruction count: 230: __C648Compiled.apply 2023-04-22 21:09:48.703 : INFO: instruction count: 205: __C648Compiled.__m654split_ToArray 2023-04-22 21:09:48.703 : INFO: instruction count: 8: __C648Compiled.__m662nNonRefAlleles 2023-04-22 21:09:48.703 : INFO: instruction count: 9: __C648Compiled.__m674begin_group_0 2023-04-22 21:09:48.703 : INFO: instruction count: 17: __C648Compiled.__m675begin_group_0 2023-04-22 21:09:48.703 : INFO: instruction count: 155: __C648Compiled.__m676split_StreamFor 2023-04-22 21:09:48.704 : INFO: instruction count: 35: __C648Compiled.__m684arrayref_bounds_check 2023-04-22 21:09:48.704 : INFO: instruction count: 73: __C648Compiled.__m688begin_group_0 2023-04-22 21:09:48.704 : INFO: instruction count: 5: __C648Compiled.__m691toInt64 2023-04-22 21:09:48.704 : INFO: instruction count: 9: __C648Compiled.setPartitionIndex 2023-04-22 21:09:48.704 : INFO: instruction count: 4: __C648Compiled.addPartitionRegion 2023-04-22 21:09:48.704 : INFO: instruction count: 4: __C648Compiled.setPool 2023-04-22 21:09:48.704 : INFO: instruction count: 3: __C648Compiled.addHailClassLoader 2023-04-22 21:09:48.704 : INFO: instruction count: 3: __C648Compiled.addFS 2023-04-22 21:09:48.704 : INFO: instruction count: 4: __C648Compiled.addTaskContext 2023-04-22 21:09:48.704 : INFO: instruction count: 45: __C648Compiled.addAndDecodeLiterals 2023-04-22 21:09:48.704 : INFO: instruction count: 27: __C648Compiled.__m700DECODE_r_struct_of_r_binaryEND_TO_SBaseStructPointer 2023-04-22 21:09:48.705 : INFO: instruction count: 31: __C648Compiled.__m701INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:48.833 : INFO: initial IR: IR size 17: (Coalesce (Let __iruid_701 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_701) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_701) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.838 : INFO: after optimize: compileLowerer, initial IR: IR size 17: (Coalesce (Let __iruid_856 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_856) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_856) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.838 : INFO: after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_856 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_856) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_856) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.843 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_859 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_859) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_859) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.844 : INFO: after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_859 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_859) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_859) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.853 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_862 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_862) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_862) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.865 : INFO: instruction count: 3: __C702HailClassLoaderContainer. 2023-04-22 21:09:48.865 : INFO: instruction count: 3: __C702HailClassLoaderContainer. 2023-04-22 21:09:48.877 : INFO: instruction count: 3: __C704FSContainer. 2023-04-22 21:09:48.878 : INFO: instruction count: 3: __C704FSContainer. 2023-04-22 21:09:48.880 : INFO: instruction count: 3: __C706Compiled. 
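The IR size 17 fragments being optimized and compiled here (e.g. __C706) are the TableFilter predicate from the relational IR: a variant is kept only when 0 < __AC < 2 * __n_called, i.e. it is not monomorphic, with a missing comparison coalesced to False. Continuing the sketch above (mt assumed to carry __AC and __n_called), the user-level equivalent is roughly:

# Drop monomorphic variants; a missing predicate filters the row out,
# matching the (Coalesce ... (False)) in the IR.
mt = mt.filter_rows((mt.__AC > 0) & (mt.__AC < 2 * mt.__n_called))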
2023-04-22 21:09:48.880 : INFO: instruction count: 86: __C706Compiled.apply 2023-04-22 21:09:48.880 : INFO: instruction count: 11: __C706Compiled.__m713ord_gt 2023-04-22 21:09:48.882 : INFO: instruction count: 16: __C706Compiled.__m714ord_gtNonnull 2023-04-22 21:09:48.882 : INFO: instruction count: 11: __C706Compiled.__m715ord_lt 2023-04-22 21:09:48.883 : INFO: instruction count: 16: __C706Compiled.__m716ord_ltNonnull 2023-04-22 21:09:48.883 : INFO: instruction count: 9: __C706Compiled.setPartitionIndex 2023-04-22 21:09:48.883 : INFO: instruction count: 4: __C706Compiled.addPartitionRegion 2023-04-22 21:09:48.883 : INFO: instruction count: 4: __C706Compiled.setPool 2023-04-22 21:09:48.883 : INFO: instruction count: 3: __C706Compiled.addHailClassLoader 2023-04-22 21:09:48.883 : INFO: instruction count: 3: __C706Compiled.addFS 2023-04-22 21:09:48.883 : INFO: instruction count: 4: __C706Compiled.addTaskContext 2023-04-22 21:09:48.884 : INFO: initial IR: IR size 17: (Coalesce (Let __iruid_701 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_701) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_701) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.889 : INFO: after optimize: compileLowerer, initial IR: IR size 17: (Coalesce (Let __iruid_865 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_865) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_865) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.890 : INFO: after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_865 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_865) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_865) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.895 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_868 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_868) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_868) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.909 : INFO: after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_868 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_868) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_868) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.915 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_871 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_871) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_871) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.927 : INFO: instruction count: 3: __C721HailClassLoaderContainer. 2023-04-22 21:09:48.927 : INFO: instruction count: 3: __C721HailClassLoaderContainer. 2023-04-22 21:09:48.927 : INFO: instruction count: 3: __C723FSContainer. 2023-04-22 21:09:48.927 : INFO: instruction count: 3: __C723FSContainer. 2023-04-22 21:09:48.954 : INFO: instruction count: 3: __C725Compiled. 
2023-04-22 21:09:48.954 : INFO: instruction count: 86: __C725Compiled.apply 2023-04-22 21:09:48.954 : INFO: instruction count: 11: __C725Compiled.__m732ord_gt 2023-04-22 21:09:48.955 : INFO: instruction count: 16: __C725Compiled.__m733ord_gtNonnull 2023-04-22 21:09:48.955 : INFO: instruction count: 11: __C725Compiled.__m734ord_lt 2023-04-22 21:09:48.955 : INFO: instruction count: 16: __C725Compiled.__m735ord_ltNonnull 2023-04-22 21:09:48.955 : INFO: instruction count: 9: __C725Compiled.setPartitionIndex 2023-04-22 21:09:48.955 : INFO: instruction count: 4: __C725Compiled.addPartitionRegion 2023-04-22 21:09:48.955 : INFO: instruction count: 4: __C725Compiled.setPool 2023-04-22 21:09:48.955 : INFO: instruction count: 3: __C725Compiled.addHailClassLoader 2023-04-22 21:09:48.955 : INFO: instruction count: 3: __C725Compiled.addFS 2023-04-22 21:09:48.955 : INFO: instruction count: 4: __C725Compiled.addTaskContext 2023-04-22 21:09:48.956 : INFO: initial IR: IR size 17: (Coalesce (Let __iruid_701 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_701) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_701) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.960 : INFO: after optimize: compileLowerer, initial IR: IR size 17: (Coalesce (Let __iruid_874 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_874) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_874) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.960 : INFO: after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_874 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_874) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_874) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.983 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 17: (Coalesce (Let __iruid_877 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_877) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_877) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.984 : INFO: after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_877 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_877) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_877) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.988 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 17: (Coalesce (Let __iruid_880 (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0)) (Coalesce (ApplySpecial 6 land () Boolean (ApplyComparisonOp GT (Ref __iruid_880) (I64 0)) (ApplyComparisonOp LT (Ref __iruid_880) (ApplyBinaryPrimOp Multiply (I64 2) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 0))))) (False))) (False)) 2023-04-22 21:09:48.993 : INFO: instruction count: 3: __C740HailClassLoaderContainer. 2023-04-22 21:09:48.993 : INFO: instruction count: 3: __C740HailClassLoaderContainer. 2023-04-22 21:09:48.994 : INFO: instruction count: 3: __C742FSContainer. 2023-04-22 21:09:48.994 : INFO: instruction count: 3: __C742FSContainer. 2023-04-22 21:09:48.995 : INFO: instruction count: 3: __C744Compiled. 
2023-04-22 21:09:48.995 : INFO: instruction count: 86: __C744Compiled.apply 2023-04-22 21:09:48.996 : INFO: instruction count: 11: __C744Compiled.__m751ord_gt 2023-04-22 21:09:48.996 : INFO: instruction count: 16: __C744Compiled.__m752ord_gtNonnull 2023-04-22 21:09:48.996 : INFO: instruction count: 11: __C744Compiled.__m753ord_lt 2023-04-22 21:09:48.996 : INFO: instruction count: 16: __C744Compiled.__m754ord_ltNonnull 2023-04-22 21:09:48.996 : INFO: instruction count: 9: __C744Compiled.setPartitionIndex 2023-04-22 21:09:48.996 : INFO: instruction count: 4: __C744Compiled.addPartitionRegion 2023-04-22 21:09:48.996 : INFO: instruction count: 4: __C744Compiled.setPool 2023-04-22 21:09:48.996 : INFO: instruction count: 3: __C744Compiled.addHailClassLoader 2023-04-22 21:09:48.996 : INFO: instruction count: 3: __C744Compiled.addFS 2023-04-22 21:09:48.996 : INFO: instruction count: 4: __C744Compiled.addTaskContext 2023-04-22 21:09:48.997 : INFO: encoder cache miss (8 hits, 7 misses, 0.533) 2023-04-22 21:09:48.999 : INFO: instruction count: 3: __C759HailClassLoaderContainer. 2023-04-22 21:09:48.999 : INFO: instruction count: 3: __C759HailClassLoaderContainer. 2023-04-22 21:08:48.999 : INFO: instruction count: 3: __C761FSContainer. 2023-04-22 21:09:48.999 : INFO: instruction count: 3: __C761FSContainer. 2023-04-22 21:09:49.012 : INFO: instruction count: 3: __C763etypeEncode. 2023-04-22 21:09:49.012 : INFO: instruction count: 7: __C763etypeEncode.apply 2023-04-22 21:09:49.012 : INFO: instruction count: 25: __C763etypeEncode.__m765ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND 2023-04-22 21:09:49.012 : INFO: instruction count: 35: __C763etypeEncode.__m766ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryEND 2023-04-22 21:09:49.012 : INFO: instruction count: 13: __C763etypeEncode.__m767ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:09:49.012 : INFO: instruction count: 16: __C763etypeEncode.__m768ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:09:49.040 : INFO: BroadcastRegionValue.broadcast: broadcasting 1 byte arrays of total size 17784 (17.37 KiB) 2023-04-22 21:09:49.041 : INFO: decoder cache miss (0 hits, 1 misses, 0.000) 2023-04-22 21:09:49.044 : INFO: instruction count: 3: __C769HailClassLoaderContainer. 2023-04-22 21:09:49.044 : INFO: instruction count: 3: __C769HailClassLoaderContainer. 2023-04-22 21:09:49.044 : INFO: instruction count: 3: __C771FSContainer. 2023-04-22 21:09:49.044 : INFO: instruction count: 3: __C771FSContainer. 2023-04-22 21:09:49.045 : INFO: instruction count: 3: __C773etypeDecode. 
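Note: the "IR size 17" dumps repeated above all encode the same row predicate: keep a variant only when 0 < __AC < 2 * __n_called, i.e. the site is polymorphic among called genotypes; the outer Coalesce(..., False) makes a missing value drop the row. The originating pipeline is not shown in this log, but the predicate is the same one Hail uses when it removes monomorphic sites (e.g. inside hl.hwe_normalized_pca). A minimal Hail (Python) sketch of an equivalent filter follows; the MatrixTable, its field names (ac, n_called, GT) and the input path are hypothetical, inferred from the IR rather than taken from this log.

    import hail as hl

    mt = hl.read_matrix_table('dataset.mt')  # hypothetical input
    mt = mt.annotate_rows(
        ac=hl.agg.sum(mt.GT.n_alt_alleles()),               # alternate allele count (__AC in the IR)
        n_called=hl.agg.count_where(hl.is_defined(mt.GT)),  # number of called genotypes (__n_called)
    )
    # Keep polymorphic sites: 0 < AC < 2 * n_called; a missing value drops the row,
    # mirroring the Coalesce(..., False) wrapper in the compiled IR.
    mt = mt.filter_rows((mt.ac > 0) & (mt.ac < 2 * mt.n_called), keep=True)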
2023-04-22 21:09:49.045 : INFO: instruction count: 7: __C773etypeDecode.apply 2023-04-22 21:09:49.045 : INFO: instruction count: 27: __C773etypeDecode.__m775DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:09:49.046 : INFO: instruction count: 58: __C773etypeDecode.__m776INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:09:49.046 : INFO: instruction count: 17: __C773etypeDecode.__m777INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:09:49.046 : INFO: instruction count: 31: __C773etypeDecode.__m778INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:49.052 MemoryStore: INFO: Block broadcast_18 stored as values in memory (estimated size 23.4 KiB, free 28.7 GiB) 2023-04-22 21:09:49.074 MemoryStore: INFO: Block broadcast_18_piece0 stored as bytes in memory (estimated size 14.2 KiB, free 28.7 GiB) 2023-04-22 21:09:49.082 BlockManagerInfo: INFO: Added broadcast_18_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 14.2 KiB, free: 28.8 GiB) 2023-04-22 21:09:49.084 SparkContext: INFO: Created broadcast 18 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:49.200 : INFO: initial IR: IR size 46: (Coalesce (Let __iruid_702 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_703 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_702) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_702))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_704 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_705 (If (IsNA (Ref __iruid_704)) (Literal Struct{__gt:Int32} ) (Ref __iruid_704)) (InsertFields (SelectFields () (Ref __iruid_705)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_705))) (Ref __iruid_702)) (Ref __iruid_703)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.215 : INFO: after optimize: compileLowerer, initial IR: IR size 46: (Coalesce (Let __iruid_890 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_891 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_890) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_890))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_892 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_893 (If (IsNA (Ref __iruid_892)) (Literal Struct{__gt:Int32} ) (Ref __iruid_892)) (InsertFields (SelectFields () (Ref __iruid_893)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_893))) (Ref __iruid_890)) (Ref __iruid_891)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.216 : INFO: after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_890 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_891 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_890) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_890))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_892 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_893 (If (IsNA (Ref __iruid_892)) (Literal Struct{__gt:Int32} ) (Ref __iruid_892)) (InsertFields (SelectFields () (Ref __iruid_893)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_893))) (Ref __iruid_890)) (Ref __iruid_891)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.273 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_902 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_903 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_902) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_902))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_904 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_905 (If (IsNA (Ref __iruid_904)) (Literal Struct{__gt:Int32} ) (Ref __iruid_904)) (InsertFields (SelectFields () (Ref __iruid_905)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_905))) (Ref __iruid_902)) (Ref __iruid_903)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.277 : INFO: after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_902 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_903 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_902) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_902))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_904 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_905 (If (IsNA (Ref __iruid_904)) (Literal Struct{__gt:Int32} ) (Ref __iruid_904)) (InsertFields (SelectFields () (Ref __iruid_905)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_905))) (Ref __iruid_902)) (Ref __iruid_903)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.296 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_914 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_915 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_914) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_914))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_916 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_917 (If (IsNA (Ref __iruid_916)) (Literal Struct{__gt:Int32} ) (Ref __iruid_916)) (InsertFields (SelectFields () (Ref __iruid_917)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_917))) (Ref __iruid_914)) (Ref __iruid_915)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.344 : INFO: encoder cache miss (8 hits, 8 misses, 0.500) 2023-04-22 21:09:49.346 : INFO: instruction count: 3: __C812HailClassLoaderContainer. 2023-04-22 21:09:49.346 : INFO: instruction count: 3: __C812HailClassLoaderContainer. 2023-04-22 21:09:49.346 : INFO: instruction count: 3: __C814FSContainer. 2023-04-22 21:09:49.347 : INFO: instruction count: 3: __C814FSContainer. 2023-04-22 21:09:49.347 : INFO: instruction count: 3: __C816etypeEncode. 2023-04-22 21:09:49.347 : INFO: instruction count: 7: __C816etypeEncode.apply 2023-04-22 21:09:49.347 : INFO: instruction count: 21: __C816etypeEncode.__m818ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_struct_of_o_int32ENDEND 2023-04-22 21:09:49.347 : INFO: instruction count: 16: __C816etypeEncode.__m819ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:09:49.348 : INFO: instruction count: 36: __C816etypeEncode.__m820ENCODE_SBaseStructPointer_TO_r_struct_of_o_int32END 2023-04-22 21:09:49.348 : INFO: instruction count: 4: __C816etypeEncode.__m821ENCODE_SInt32$_TO_o_int32 2023-04-22 21:09:49.350 MemoryStore: INFO: Block broadcast_19 stored as values in memory (estimated size 160.0 B, free 28.7 GiB) 2023-04-22 21:09:49.352 MemoryStore: INFO: Block broadcast_19_piece0 stored as bytes in memory (estimated size 114.0 B, free 28.7 GiB) 2023-04-22 21:09:49.355 BlockManagerInfo: INFO: Added broadcast_19_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 114.0 B, free: 28.8 GiB) 2023-04-22 21:09:49.357 SparkContext: INFO: Created broadcast 19 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:49.357 : INFO: instruction count: 3: __C779HailClassLoaderContainer. 2023-04-22 21:09:49.357 : INFO: instruction count: 3: __C779HailClassLoaderContainer. 2023-04-22 21:09:49.357 : INFO: instruction count: 3: __C781FSContainer. 2023-04-22 21:09:49.357 : INFO: instruction count: 3: __C781FSContainer. 2023-04-22 21:09:49.361 : INFO: instruction count: 3: __C783Compiled. 
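Note: the "IR size 46" dumps in this stretch describe the per-entry transformation that follows the filter. For each row, mean = __AC / __n_called and sd = sqrt(mean * (2 - mean) * 114591 / 2), where 114591.0 is a constant baked into the IR (presumably the number of retained variants); each genotype is then rewritten as (gt - mean) / sd, with missing results coalesced to 0.0. This is consistent with the Hardy-Weinberg-style normalization Hail applies before a PCA-type decomposition. A hedged sketch, continuing the hypothetical mt from the earlier note (mean_gt, sd and hwe_norm are illustrative names, not taken from this log):

    import hail as hl

    n_variants = 114591  # appears as F64 114591.0 in the IR above; assumed to be the row count
    mt = mt.annotate_rows(mean_gt=mt.ac / mt.n_called)
    mt = mt.annotate_rows(sd=hl.sqrt(mt.mean_gt * (2 - mt.mean_gt) * n_variants / 2))
    # Normalize each genotype; entries that come out missing are set to 0.0,
    # matching the Coalesce(..., 0.0) in the compiled IR.
    mt = mt.annotate_entries(
        hwe_norm=hl.coalesce(
            (hl.float64(mt.GT.n_alt_alleles()) - mt.mean_gt) / mt.sd,
            0.0,
        )
    )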
2023-04-22 21:09:49.361 : INFO: instruction count: 167: __C783Compiled.apply 2023-04-22 21:09:49.361 : INFO: instruction count: 8: __C783Compiled.__m790sqrt 2023-04-22 21:09:49.362 : INFO: instruction count: 215: __C783Compiled.__m792split_ToArray 2023-04-22 21:09:49.362 : INFO: instruction count: 9: __C783Compiled.setPartitionIndex 2023-04-22 21:09:49.362 : INFO: instruction count: 4: __C783Compiled.addPartitionRegion 2023-04-22 21:09:49.362 : INFO: instruction count: 4: __C783Compiled.setPool 2023-04-22 21:09:49.362 : INFO: instruction count: 3: __C783Compiled.addHailClassLoader 2023-04-22 21:09:49.362 : INFO: instruction count: 3: __C783Compiled.addFS 2023-04-22 21:09:49.362 : INFO: instruction count: 4: __C783Compiled.addTaskContext 2023-04-22 21:09:49.362 : INFO: instruction count: 54: __C783Compiled.addAndDecodeLiterals 2023-04-22 21:09:49.362 : INFO: instruction count: 36: __C783Compiled.__m808DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:49.362 : INFO: instruction count: 31: __C783Compiled.__m809INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:49.362 : INFO: instruction count: 48: __C783Compiled.__m810INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:49.362 : INFO: instruction count: 10: __C783Compiled.__m811INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:49.364 : INFO: initial IR: IR size 46: (Coalesce (Let __iruid_702 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_703 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_702) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_702))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_704 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_705 (If (IsNA (Ref __iruid_704)) (Literal Struct{__gt:Int32} ) (Ref __iruid_704)) (InsertFields (SelectFields () (Ref __iruid_705)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_705))) (Ref __iruid_702)) (Ref __iruid_703)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.399 : INFO: after optimize: compileLowerer, initial IR: IR size 46: (Coalesce (Let __iruid_926 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_927 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_926) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_926))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_928 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_929 (If (IsNA (Ref __iruid_928)) (Literal Struct{__gt:Int32} ) (Ref __iruid_928)) (InsertFields (SelectFields () (Ref __iruid_929)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_929))) (Ref __iruid_926)) (Ref __iruid_927)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.400 : INFO: after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_926 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_927 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_926) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_926))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_928 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_929 (If (IsNA (Ref __iruid_928)) (Literal Struct{__gt:Int32} ) (Ref __iruid_928)) (InsertFields (SelectFields () (Ref __iruid_929)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_929))) (Ref __iruid_926)) (Ref __iruid_927)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.432 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_938 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_939 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_938) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_938))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_940 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_941 (If (IsNA (Ref __iruid_940)) (Literal Struct{__gt:Int32} ) (Ref __iruid_940)) (InsertFields (SelectFields () (Ref __iruid_941)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_941))) (Ref __iruid_938)) (Ref __iruid_939)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.435 : INFO: after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_938 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_939 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_938) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_938))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_940 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_941 (If (IsNA (Ref __iruid_940)) (Literal Struct{__gt:Int32} ) (Ref __iruid_940)) (InsertFields (SelectFields () (Ref __iruid_941)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_941))) (Ref __iruid_938)) (Ref __iruid_939)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.450 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_950 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_951 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_950) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_950))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_952 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_953 (If (IsNA (Ref __iruid_952)) (Literal Struct{__gt:Int32} ) (Ref __iruid_952)) (InsertFields (SelectFields () (Ref __iruid_953)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_953))) (Ref __iruid_950)) (Ref __iruid_951)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.479 : INFO: encoder cache hit 2023-04-22 21:09:49.480 MemoryStore: INFO: Block broadcast_20 stored as values in memory (estimated size 160.0 B, free 28.7 GiB) 2023-04-22 21:09:49.482 MemoryStore: INFO: Block broadcast_20_piece0 stored as bytes in memory (estimated size 114.0 B, free 28.7 GiB) 2023-04-22 21:09:49.486 BlockManagerInfo: INFO: Added broadcast_20_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 114.0 B, free: 28.8 GiB) 2023-04-22 21:09:49.487 SparkContext: INFO: Created broadcast 20 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:49.487 : INFO: instruction count: 3: __C822HailClassLoaderContainer. 2023-04-22 21:09:49.487 : INFO: instruction count: 3: __C822HailClassLoaderContainer. 2023-04-22 21:09:49.487 : INFO: instruction count: 3: __C824FSContainer. 2023-04-22 21:09:49.487 : INFO: instruction count: 3: __C824FSContainer. 2023-04-22 21:09:49.491 : INFO: instruction count: 3: __C826Compiled. 2023-04-22 21:09:49.491 : INFO: instruction count: 167: __C826Compiled.apply 2023-04-22 21:09:49.491 : INFO: instruction count: 8: __C826Compiled.__m833sqrt 2023-04-22 21:09:49.491 : INFO: instruction count: 215: __C826Compiled.__m835split_ToArray 2023-04-22 21:09:49.492 : INFO: instruction count: 9: __C826Compiled.setPartitionIndex 2023-04-22 21:09:49.492 : INFO: instruction count: 4: __C826Compiled.addPartitionRegion 2023-04-22 21:09:49.492 : INFO: instruction count: 4: __C826Compiled.setPool 2023-04-22 21:09:49.492 : INFO: instruction count: 3: __C826Compiled.addHailClassLoader 2023-04-22 21:09:49.492 : INFO: instruction count: 3: __C826Compiled.addFS 2023-04-22 21:09:49.492 : INFO: instruction count: 4: __C826Compiled.addTaskContext 2023-04-22 21:09:49.492 : INFO: instruction count: 54: __C826Compiled.addAndDecodeLiterals 2023-04-22 21:09:49.492 : INFO: instruction count: 36: __C826Compiled.__m851DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:49.492 : INFO: instruction count: 31: __C826Compiled.__m852INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:49.492 : INFO: instruction count: 48: __C826Compiled.__m853INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:49.492 : INFO: instruction count: 10: __C826Compiled.__m854INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:49.494 : INFO: initial IR: IR size 46: (Coalesce (Let __iruid_702 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_703 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_702) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_702))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_704 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_705 (If (IsNA (Ref __iruid_704)) (Literal Struct{__gt:Int32} ) (Ref __iruid_704)) (InsertFields (SelectFields () (Ref __iruid_705)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_705))) (Ref __iruid_702)) (Ref __iruid_703)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.554 : INFO: after optimize: compileLowerer, initial IR: IR size 46: (Coalesce (Let __iruid_962 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_963 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_962) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_962))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_964 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_965 (If (IsNA (Ref __iruid_964)) (Literal Struct{__gt:Int32} ) (Ref __iruid_964)) (InsertFields (SelectFields () (Ref __iruid_965)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_965))) (Ref __iruid_962)) (Ref __iruid_963)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.555 : INFO: after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_962 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_963 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_962) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_962))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_964 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_965 (If (IsNA (Ref __iruid_964)) (Literal Struct{__gt:Int32} ) (Ref __iruid_964)) (InsertFields (SelectFields () (Ref __iruid_965)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_965))) (Ref __iruid_962)) (Ref __iruid_963)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.567 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 46: (Coalesce (Let __iruid_974 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_975 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_974) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_974))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_976 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_977 (If (IsNA (Ref __iruid_976)) (Literal Struct{__gt:Int32} ) (Ref __iruid_976)) (InsertFields (SelectFields () (Ref __iruid_977)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_977))) (Ref __iruid_974)) (Ref __iruid_975)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.579 : INFO: after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_974 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_975 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_974) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_974))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_976 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_977 (If (IsNA (Ref __iruid_976)) (Literal Struct{__gt:Int32} ) (Ref __iruid_976)) (InsertFields (SelectFields () (Ref __iruid_977)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_977))) (Ref __iruid_974)) (Ref __iruid_975)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.588 BlockManagerInfo: INFO: Removed broadcast_15_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 113.0 B, free: 28.8 GiB) 2023-04-22 21:09:49.619 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 46: (Coalesce (Let __iruid_986 (ApplyBinaryPrimOp FloatingPointDivide (GetField __AC (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) (GetField __n_called (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_987 (Apply 10 sqrt () Float64 (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Multiply (ApplyBinaryPrimOp Multiply (Ref __iruid_986) (ApplyBinaryPrimOp Subtract (F64 2.0) (Ref __iruid_986))) (F64 114591.0)) (F64 2.0))) (InsertFields (SelectFields (locus alleles) (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1)) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_988 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (In SingleCodeEmitParamType(true, PTypeReferenceSingleCodeType(+PCStruct{locus:+PCLocus(GRCh38),alleles:+PCArray[+PCString],__AC:+PInt64,__n_called:+PInt64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:+PCArray[+PCStruct{__gt:PInt32}]})) 1))) (Let __iruid_989 (If (IsNA (Ref __iruid_988)) (Literal Struct{__gt:Int32} ) (Ref __iruid_988)) (InsertFields (SelectFields () (Ref __iruid_989)) None (__uid_3 (Coalesce (ApplyBinaryPrimOp FloatingPointDivide (ApplyBinaryPrimOp Subtract (Cast Float64 (GetField __gt (Ref __iruid_989))) (Ref __iruid_986)) (Ref __iruid_987)) (F64 0.0))))))))))) (Die Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{__uid_3:Float64}]} -1 (Str "Internal e..."))) 2023-04-22 21:09:49.648 : INFO: encoder cache hit 2023-04-22 21:09:49.649 MemoryStore: INFO: Block broadcast_21 stored as values in memory (estimated size 160.0 B, free 28.7 GiB) 2023-04-22 21:09:49.651 MemoryStore: INFO: Block broadcast_21_piece0 stored as bytes in memory (estimated size 114.0 B, free 28.7 GiB) 2023-04-22 21:09:49.658 BlockManagerInfo: INFO: Added broadcast_21_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 114.0 B, free: 28.8 GiB) 2023-04-22 21:09:49.659 SparkContext: INFO: Created broadcast 21 from broadcast at SparkBackend.scala:354 2023-04-22 21:09:49.659 : INFO: instruction count: 3: __C855HailClassLoaderContainer. 2023-04-22 21:09:49.659 : INFO: instruction count: 3: __C855HailClassLoaderContainer. 2023-04-22 21:09:49.659 : INFO: instruction count: 3: __C857FSContainer. 2023-04-22 21:09:49.659 : INFO: instruction count: 3: __C857FSContainer. 2023-04-22 21:09:49.663 : INFO: instruction count: 3: __C859Compiled. 
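[annotation, not part of the log] The IR dumps above are dense, but the arithmetic they compile is simple: before PCA, every genotype entry is HWE-normalized. Reading off the IR, __iruid_962 is the per-variant mean genotype __AC / __n_called, __iruid_963 is sqrt(mean * (2 - mean) * 114591 / 2), where 114591.0 is the variant count baked in as a literal, and each entry becomes (gt - mean) / sd, with missing genotypes imputed to 0.0 by the Coalesce. This appears to match the normalization performed by hl.hwe_normalized_pca. A minimal sketch of the equivalent Hail expression, assuming row fields __AC and __n_called and entry field __gt as named in the IR:

import hail as hl

def hwe_normalize_entry(mt, n_variants):
    # per-variant mean genotype (__iruid_962 in the IR above)
    mean_gt = mt['__AC'] / mt['__n_called']
    # HWE-scaled standard deviation (__iruid_963); n_variants is the 114591.0 literal
    hwe_sd = hl.sqrt(mean_gt * (2 - mean_gt) * n_variants / 2)
    # normalized entry, with missing genotypes imputed to 0.0 (the Coalesce ... (F64 0.0))
    return hl.or_else((hl.float64(mt['__gt']) - mean_gt) / hwe_sd, 0.0)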
2023-04-22 21:09:49.663 : INFO: instruction count: 167: __C859Compiled.apply 2023-04-22 21:09:49.663 : INFO: instruction count: 8: __C859Compiled.__m866sqrt 2023-04-22 21:09:49.663 : INFO: instruction count: 215: __C859Compiled.__m868split_ToArray 2023-04-22 21:09:49.663 : INFO: instruction count: 9: __C859Compiled.setPartitionIndex 2023-04-22 21:09:49.663 : INFO: instruction count: 4: __C859Compiled.addPartitionRegion 2023-04-22 21:09:49.663 : INFO: instruction count: 4: __C859Compiled.setPool 2023-04-22 21:09:49.663 : INFO: instruction count: 3: __C859Compiled.addHailClassLoader 2023-04-22 21:09:49.663 : INFO: instruction count: 3: __C859Compiled.addFS 2023-04-22 21:09:49.664 : INFO: instruction count: 4: __C859Compiled.addTaskContext 2023-04-22 21:09:49.664 : INFO: instruction count: 54: __C859Compiled.addAndDecodeLiterals 2023-04-22 21:09:49.664 : INFO: instruction count: 36: __C859Compiled.__m884DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:09:49.664 : INFO: instruction count: 31: __C859Compiled.__m885INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:09:49.664 : INFO: instruction count: 48: __C859Compiled.__m886INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:09:49.664 : INFO: instruction count: 10: __C859Compiled.__m887INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:09:50.305 SparkContext: INFO: Starting job: collect at ContextRDD.scala:176 2023-04-22 21:09:50.319 DAGScheduler: INFO: Got job 1 (collect at ContextRDD.scala:176) with 8 output partitions 2023-04-22 21:09:50.319 DAGScheduler: INFO: Final stage: ResultStage 1 (collect at ContextRDD.scala:176) 2023-04-22 21:09:50.319 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:09:50.319 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:09:50.320 DAGScheduler: INFO: Submitting ResultStage 1 (MapPartitionsRDD[8] at mapPartitions at ContextRDD.scala:168), which has no missing parents 2023-04-22 21:09:50.393 MemoryStore: INFO: Block broadcast_22 stored as values in memory (estimated size 862.2 KiB, free 28.7 GiB) 2023-04-22 21:09:50.399 MemoryStore: INFO: Block broadcast_22_piece0 stored as bytes in memory (estimated size 417.5 KiB, free 28.7 GiB) 2023-04-22 21:09:50.404 BlockManagerInfo: INFO: Added broadcast_22_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 417.5 KiB, free: 28.8 GiB) 2023-04-22 21:09:50.413 SparkContext: INFO: Created broadcast 22 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:09:50.413 DAGScheduler: INFO: Submitting 8 missing tasks from ResultStage 1 (MapPartitionsRDD[8] at mapPartitions at ContextRDD.scala:168) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:09:50.414 TaskSchedulerImpl: INFO: Adding task set 1.0 with 8 tasks resource profile 0 2023-04-22 21:09:50.421 TaskSetManager: INFO: Starting task 0.0 in stage 1.0 (TID 8) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4505 bytes) taskResourceAssignments Map() 2023-04-22 21:09:50.422 Executor: INFO: Running task 0.0 in stage 1.0 (TID 8) 2023-04-22 21:09:50.697 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 0.0 in stage 1.0 (TID 8) 2023-04-22 21:09:50.744 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 1.0 (TID 8) 2023-04-22 21:09:57.123 : INFO: TaskReport: stage=1, partition=0, attempt=0, peakBytes=458752, 
peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:09:57.123 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 1.0 (TID 8) 2023-04-22 21:09:57.127 Executor: INFO: Finished task 0.0 in stage 1.0 (TID 8). 807 bytes result sent to driver 2023-04-22 21:09:57.128 TaskSetManager: INFO: Starting task 1.0 in stage 1.0 (TID 9) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4505 bytes) taskResourceAssignments Map() 2023-04-22 21:09:57.129 TaskSetManager: INFO: Finished task 0.0 in stage 1.0 (TID 8) in 6709 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:09:57.133 Executor: INFO: Running task 1.0 in stage 1.0 (TID 9) 2023-04-22 21:09:57.198 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 1.0 in stage 1.0 (TID 9) 2023-04-22 21:09:57.217 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 1.0 (TID 9) 2023-04-22 21:10:02.976 : INFO: TaskReport: stage=1, partition=1, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:02.976 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 1.0 (TID 9) 2023-04-22 21:10:02.977 Executor: INFO: Finished task 1.0 in stage 1.0 (TID 9). 807 bytes result sent to driver 2023-04-22 21:10:02.978 TaskSetManager: INFO: Starting task 2.0 in stage 1.0 (TID 10) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:02.978 TaskSetManager: INFO: Finished task 1.0 in stage 1.0 (TID 9) in 5850 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:10:02.980 Executor: INFO: Running task 2.0 in stage 1.0 (TID 10) 2023-04-22 21:10:03.050 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 2.0 in stage 1.0 (TID 10) 2023-04-22 21:10:03.052 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 1.0 (TID 10) 2023-04-22 21:10:05.121 BlockManagerInfo: INFO: Removed broadcast_19_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 114.0 B, free: 28.8 GiB) 2023-04-22 21:10:08.797 : INFO: TaskReport: stage=1, partition=2, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:08.797 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 1.0 (TID 10) 2023-04-22 21:10:08.801 Executor: INFO: Finished task 2.0 in stage 1.0 (TID 10). 
850 bytes result sent to driver 2023-04-22 21:10:08.802 TaskSetManager: INFO: Starting task 3.0 in stage 1.0 (TID 11) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:08.802 TaskSetManager: INFO: Finished task 2.0 in stage 1.0 (TID 10) in 5825 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:10:08.807 Executor: INFO: Running task 3.0 in stage 1.0 (TID 11) 2023-04-22 21:10:08.852 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 3.0 in stage 1.0 (TID 11) 2023-04-22 21:10:08.856 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 1.0 (TID 11) 2023-04-22 21:10:14.578 : INFO: TaskReport: stage=1, partition=3, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:14.578 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 1.0 (TID 11) 2023-04-22 21:10:14.583 Executor: INFO: Finished task 3.0 in stage 1.0 (TID 11). 807 bytes result sent to driver 2023-04-22 21:10:14.584 TaskSetManager: INFO: Starting task 4.0 in stage 1.0 (TID 12) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:14.584 TaskSetManager: INFO: Finished task 3.0 in stage 1.0 (TID 11) in 5783 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:10:14.587 Executor: INFO: Running task 4.0 in stage 1.0 (TID 12) 2023-04-22 21:10:14.632 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 4.0 in stage 1.0 (TID 12) 2023-04-22 21:10:14.634 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 1.0 (TID 12) 2023-04-22 21:10:20.325 : INFO: TaskReport: stage=1, partition=4, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:20.325 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 1.0 (TID 12) 2023-04-22 21:10:20.326 Executor: INFO: Finished task 4.0 in stage 1.0 (TID 12). 
807 bytes result sent to driver 2023-04-22 21:10:20.332 TaskSetManager: INFO: Starting task 5.0 in stage 1.0 (TID 13) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:20.334 TaskSetManager: INFO: Finished task 4.0 in stage 1.0 (TID 12) in 5751 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:10:20.335 Executor: INFO: Running task 5.0 in stage 1.0 (TID 13) 2023-04-22 21:10:20.389 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 5.0 in stage 1.0 (TID 13) 2023-04-22 21:10:20.391 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 5.0 in stage 1.0 (TID 13) 2023-04-22 21:10:26.105 : INFO: TaskReport: stage=1, partition=5, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:26.105 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 5.0 in stage 1.0 (TID 13) 2023-04-22 21:10:26.121 Executor: INFO: Finished task 5.0 in stage 1.0 (TID 13). 807 bytes result sent to driver 2023-04-22 21:10:26.122 TaskSetManager: INFO: Starting task 6.0 in stage 1.0 (TID 14) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:26.122 TaskSetManager: INFO: Finished task 5.0 in stage 1.0 (TID 13) in 5790 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:10:26.123 Executor: INFO: Running task 6.0 in stage 1.0 (TID 14) 2023-04-22 21:10:26.178 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 6.0 in stage 1.0 (TID 14) 2023-04-22 21:10:26.180 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 6.0 in stage 1.0 (TID 14) 2023-04-22 21:10:31.871 : INFO: TaskReport: stage=1, partition=6, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:31.871 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 6.0 in stage 1.0 (TID 14) 2023-04-22 21:10:31.871 Executor: INFO: Finished task 6.0 in stage 1.0 (TID 14). 
807 bytes result sent to driver 2023-04-22 21:10:31.872 TaskSetManager: INFO: Starting task 7.0 in stage 1.0 (TID 15) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:10:31.872 TaskSetManager: INFO: Finished task 6.0 in stage 1.0 (TID 14) in 5750 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:10:31.873 Executor: INFO: Running task 7.0 in stage 1.0 (TID 15) 2023-04-22 21:10:31.943 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 7.0 in stage 1.0 (TID 15) 2023-04-22 21:10:31.945 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 61: Executor task launch worker for task 7.0 in stage 1.0 (TID 15) 2023-04-22 21:10:37.664 : INFO: TaskReport: stage=1, partition=7, attempt=0, peakBytes=458752, peakBytesReadable=448.00 KiB, chunks requested=14323, cache hits=14322 2023-04-22 21:10:37.664 : INFO: RegionPool: FREE: 448.0K allocated (384.0K blocks / 64.0K chunks), regions.size = 5, 0 current java objects, thread 61: Executor task launch worker for task 7.0 in stage 1.0 (TID 15) 2023-04-22 21:10:37.670 Executor: INFO: Finished task 7.0 in stage 1.0 (TID 15). 807 bytes result sent to driver 2023-04-22 21:10:37.671 TaskSetManager: INFO: Finished task 7.0 in stage 1.0 (TID 15) in 5799 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:10:37.672 TaskSchedulerImpl: INFO: Removed TaskSet 1.0, whose tasks have all completed, from pool 2023-04-22 21:10:37.672 DAGScheduler: INFO: ResultStage 1 (collect at ContextRDD.scala:176) finished in 47.350 s 2023-04-22 21:10:37.672 DAGScheduler: INFO: Job 1 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:10:37.672 TaskSchedulerImpl: INFO: Killing all running tasks in stage 1: Stage finished 2023-04-22 21:10:37.673 DAGScheduler: INFO: Job 1 finished: collect at ContextRDD.scala:176, took 47.367523 s 2023-04-22 21:10:37.677 MemoryStore: INFO: Block broadcast_23 stored as values in memory (estimated size 128.0 B, free 28.7 GiB) 2023-04-22 21:10:37.678 MemoryStore: INFO: Block broadcast_23_piece0 stored as bytes in memory (estimated size 70.0 B, free 28.7 GiB) 2023-04-22 21:10:37.679 BlockManagerInfo: INFO: Added broadcast_23_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 70.0 B, free: 28.8 GiB) 2023-04-22 21:10:37.687 SparkContext: INFO: Created broadcast 23 from broadcast at SparkBackend.scala:354 2023-04-22 21:10:37.997 Hail: INFO: pca: running PCA with 10 components... 2023-04-22 21:10:38.035 RowMatrix: WARN: The input data is not directly cached, which may hurt performance if its parent RDDs are also uncached. 
2023-04-22 21:10:38.062 InstanceBuilder$NativeARPACK: WARN: Failed to load implementation from:dev.ludovic.netlib.arpack.JNIARPACK 2023-04-22 21:10:38.520 BlockManagerInfo: INFO: Removed broadcast_22_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 417.5 KiB, free: 28.8 GiB) 2023-04-22 21:10:38.875 MemoryStore: INFO: Block broadcast_24 stored as values in memory (estimated size 97.4 KiB, free 28.7 GiB) 2023-04-22 21:10:38.879 MemoryStore: INFO: Block broadcast_24_piece0 stored as bytes in memory (estimated size 65.3 KiB, free 28.7 GiB) 2023-04-22 21:10:38.880 BlockManagerInfo: INFO: Added broadcast_24_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 65.3 KiB, free: 28.8 GiB) 2023-04-22 21:10:38.881 SparkContext: INFO: Created broadcast 24 from broadcast at RowMatrix.scala:93 2023-04-22 21:10:39.124 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:10:39.161 DAGScheduler: INFO: Registering RDD 16 (treeAggregate at RowMatrix.scala:94) as input to shuffle 0 2023-04-22 21:10:39.165 DAGScheduler: INFO: Got job 2 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:10:39.165 DAGScheduler: INFO: Final stage: ResultStage 3 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:10:39.165 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 2) 2023-04-22 21:10:39.166 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 2) 2023-04-22 21:10:39.178 DAGScheduler: INFO: Submitting ShuffleMapStage 2 (MapPartitionsRDD[16] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:10:39.280 MemoryStore: INFO: Block broadcast_25 stored as values in memory (estimated size 868.9 KiB, free 28.7 GiB) 2023-04-22 21:10:39.296 MemoryStore: INFO: Block broadcast_25_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 28.7 GiB) 2023-04-22 21:10:39.297 BlockManagerInfo: INFO: Added broadcast_25_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 28.8 GiB) 2023-04-22 21:10:39.298 SparkContext: INFO: Created broadcast 25 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:10:39.299 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 2 (MapPartitionsRDD[16] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:10:39.299 TaskSchedulerImpl: INFO: Adding task set 2.0 with 8 tasks resource profile 0 2023-04-22 21:10:39.301 TaskSetManager: INFO: Starting task 0.0 in stage 2.0 (TID 16) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:10:39.301 Executor: INFO: Running task 0.0 in stage 2.0 (TID 16) 2023-04-22 21:10:39.514 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 0.0 in stage 2.0 (TID 16) 2023-04-22 21:10:39.518 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 2.0 (TID 16) 2023-04-22 21:10:47.021 MemoryStore: INFO: Block rdd_12_0 stored as values in memory (estimated size 454.1 MiB, free 28.3 GiB) 2023-04-22 21:10:47.022 BlockManagerInfo: INFO: Added rdd_12_0 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 28.4 GiB) 2023-04-22 21:10:47.149 BLAS: WARN: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS 2023-04-22 21:10:47.226 BLAS: WARN: Failed to load implementation from: 
com.github.fommil.netlib.NativeRefBLAS 2023-04-22 21:10:47.619 : INFO: TaskReport: stage=2, partition=0, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:47.619 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 2.0 (TID 16) 2023-04-22 21:10:47.621 Executor: INFO: Finished task 0.0 in stage 2.0 (TID 16). 1153 bytes result sent to driver 2023-04-22 21:10:47.622 TaskSetManager: INFO: Starting task 1.0 in stage 2.0 (TID 17) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:10:47.636 Executor: INFO: Running task 1.0 in stage 2.0 (TID 17) 2023-04-22 21:10:47.657 TaskSetManager: INFO: Finished task 0.0 in stage 2.0 (TID 16) in 8357 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:10:48.123 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 1.0 in stage 2.0 (TID 17) 2023-04-22 21:10:48.125 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 2.0 (TID 17) 2023-04-22 21:10:48.138 BlockManagerInfo: INFO: Removed broadcast_5_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 32.4 KiB, free: 28.4 GiB) 2023-04-22 21:10:48.215 BlockManagerInfo: INFO: Removed broadcast_6_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 378.5 KiB, free: 28.4 GiB) 2023-04-22 21:10:48.277 BlockManagerInfo: INFO: Removed broadcast_4_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 49.0 B, free: 28.4 GiB) 2023-04-22 21:10:48.324 BlockManagerInfo: INFO: Removed broadcast_3_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 217.0 B, free: 28.4 GiB) 2023-04-22 21:10:55.189 MemoryStore: INFO: Block rdd_12_1 stored as values in memory (estimated size 454.1 MiB, free 27.8 GiB) 2023-04-22 21:10:55.190 BlockManagerInfo: INFO: Added rdd_12_1 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 27.9 GiB) 2023-04-22 21:10:55.400 : INFO: TaskReport: stage=2, partition=1, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:10:55.400 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 2.0 (TID 17) 2023-04-22 21:10:55.414 Executor: INFO: Finished task 1.0 in stage 2.0 (TID 17). 
1153 bytes result sent to driver 2023-04-22 21:10:55.415 TaskSetManager: INFO: Starting task 2.0 in stage 2.0 (TID 18) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:10:55.416 TaskSetManager: INFO: Finished task 1.0 in stage 2.0 (TID 17) in 7795 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:10:55.423 Executor: INFO: Running task 2.0 in stage 2.0 (TID 18) 2023-04-22 21:10:55.492 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 2.0 in stage 2.0 (TID 18) 2023-04-22 21:10:55.494 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 2.0 (TID 18) 2023-04-22 21:11:02.846 MemoryStore: INFO: Block rdd_12_2 stored as values in memory (estimated size 454.1 MiB, free 27.4 GiB) 2023-04-22 21:11:02.847 BlockManagerInfo: INFO: Added rdd_12_2 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 27.5 GiB) 2023-04-22 21:11:02.999 : INFO: TaskReport: stage=2, partition=2, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:11:02.999 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 2.0 (TID 18) 2023-04-22 21:11:03.000 Executor: INFO: Finished task 2.0 in stage 2.0 (TID 18). 1153 bytes result sent to driver 2023-04-22 21:11:03.001 TaskSetManager: INFO: Starting task 3.0 in stage 2.0 (TID 19) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:03.002 TaskSetManager: INFO: Finished task 2.0 in stage 2.0 (TID 18) in 7587 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:03.010 Executor: INFO: Running task 3.0 in stage 2.0 (TID 19) 2023-04-22 21:11:03.088 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 3.0 in stage 2.0 (TID 19) 2023-04-22 21:11:03.090 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 2.0 (TID 19) 2023-04-22 21:11:10.407 MemoryStore: INFO: Block rdd_12_3 stored as values in memory (estimated size 454.1 MiB, free 27.0 GiB) 2023-04-22 21:11:10.412 BlockManagerInfo: INFO: Added rdd_12_3 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 27.0 GiB) 2023-04-22 21:11:10.541 : INFO: TaskReport: stage=2, partition=3, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:11:10.541 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 2.0 (TID 19) 2023-04-22 21:11:10.554 Executor: INFO: Finished task 3.0 in stage 2.0 (TID 19). 
1153 bytes result sent to driver 2023-04-22 21:11:10.556 TaskSetManager: INFO: Starting task 4.0 in stage 2.0 (TID 20) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:10.556 TaskSetManager: INFO: Finished task 3.0 in stage 2.0 (TID 19) in 7555 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:10.563 Executor: INFO: Running task 4.0 in stage 2.0 (TID 20) 2023-04-22 21:11:10.600 : INFO: RegionPool: initialized for thread 121: Executor task launch worker for task 4.0 in stage 2.0 (TID 20) 2023-04-22 21:11:10.602 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 121: Executor task launch worker for task 4.0 in stage 2.0 (TID 20) 2023-04-22 21:11:17.380 MemoryStore: INFO: Block rdd_12_4 stored as values in memory (estimated size 454.1 MiB, free 26.5 GiB) 2023-04-22 21:11:17.383 BlockManagerInfo: INFO: Added rdd_12_4 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 26.6 GiB) 2023-04-22 21:11:17.556 : INFO: TaskReport: stage=2, partition=4, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:11:17.556 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 121: Executor task launch worker for task 4.0 in stage 2.0 (TID 20) 2023-04-22 21:11:17.557 Executor: INFO: Finished task 4.0 in stage 2.0 (TID 20). 1110 bytes result sent to driver 2023-04-22 21:11:17.558 TaskSetManager: INFO: Starting task 5.0 in stage 2.0 (TID 21) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:17.558 TaskSetManager: INFO: Finished task 4.0 in stage 2.0 (TID 20) in 7003 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:17.562 Executor: INFO: Running task 5.0 in stage 2.0 (TID 21) 2023-04-22 21:11:17.614 : INFO: RegionPool: initialized for thread 121: Executor task launch worker for task 5.0 in stage 2.0 (TID 21) 2023-04-22 21:11:17.615 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 121: Executor task launch worker for task 5.0 in stage 2.0 (TID 21) 2023-04-22 21:11:25.428 MemoryStore: INFO: Block rdd_12_5 stored as values in memory (estimated size 454.1 MiB, free 26.1 GiB) 2023-04-22 21:11:25.429 BlockManagerInfo: INFO: Added rdd_12_5 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 26.2 GiB) 2023-04-22 21:11:25.563 : INFO: TaskReport: stage=2, partition=5, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:11:25.563 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 121: Executor task launch worker for task 5.0 in stage 2.0 (TID 21) 2023-04-22 21:11:25.567 Executor: INFO: Finished task 5.0 in stage 2.0 (TID 21). 
1153 bytes result sent to driver 2023-04-22 21:11:25.568 TaskSetManager: INFO: Starting task 6.0 in stage 2.0 (TID 22) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:25.569 TaskSetManager: INFO: Finished task 5.0 in stage 2.0 (TID 21) in 8012 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:25.571 Executor: INFO: Running task 6.0 in stage 2.0 (TID 22) 2023-04-22 21:11:25.601 : INFO: RegionPool: initialized for thread 121: Executor task launch worker for task 6.0 in stage 2.0 (TID 22) 2023-04-22 21:11:25.603 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 121: Executor task launch worker for task 6.0 in stage 2.0 (TID 22) 2023-04-22 21:11:32.385 MemoryStore: INFO: Block rdd_12_6 stored as values in memory (estimated size 454.1 MiB, free 25.6 GiB) 2023-04-22 21:11:32.387 BlockManagerInfo: INFO: Added rdd_12_6 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 25.7 GiB) 2023-04-22 21:11:32.523 : INFO: TaskReport: stage=2, partition=6, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:11:32.523 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 121: Executor task launch worker for task 6.0 in stage 2.0 (TID 22) 2023-04-22 21:11:32.524 Executor: INFO: Finished task 6.0 in stage 2.0 (TID 22). 1110 bytes result sent to driver 2023-04-22 21:11:32.536 TaskSetManager: INFO: Starting task 7.0 in stage 2.0 (TID 23) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:32.537 TaskSetManager: INFO: Finished task 6.0 in stage 2.0 (TID 22) in 6969 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:32.566 Executor: INFO: Running task 7.0 in stage 2.0 (TID 23) 2023-04-22 21:11:32.626 : INFO: RegionPool: initialized for thread 121: Executor task launch worker for task 7.0 in stage 2.0 (TID 23) 2023-04-22 21:11:32.629 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 121: Executor task launch worker for task 7.0 in stage 2.0 (TID 23) 2023-04-22 21:11:39.562 MemoryStore: INFO: Block rdd_12_7 stored as values in memory (estimated size 454.1 MiB, free 25.2 GiB) 2023-04-22 21:11:39.563 BlockManagerInfo: INFO: Added rdd_12_7 in memory on uger-c010.broadinstitute.org:46121 (size: 454.1 MiB, free: 25.3 GiB) 2023-04-22 21:11:39.699 : INFO: TaskReport: stage=2, partition=7, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14323, cache hits=14322 2023-04-22 21:11:39.699 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 121: Executor task launch worker for task 7.0 in stage 2.0 (TID 23) 2023-04-22 21:11:39.706 Executor: INFO: Finished task 7.0 in stage 2.0 (TID 23). 
1110 bytes result sent to driver 2023-04-22 21:11:39.708 TaskSetManager: INFO: Finished task 7.0 in stage 2.0 (TID 23) in 7171 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:39.708 TaskSchedulerImpl: INFO: Removed TaskSet 2.0, whose tasks have all completed, from pool 2023-04-22 21:11:39.708 DAGScheduler: INFO: ShuffleMapStage 2 (treeAggregate at RowMatrix.scala:94) finished in 60.527 s 2023-04-22 21:11:39.709 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:39.709 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:39.709 DAGScheduler: INFO: waiting: Set(ResultStage 3) 2023-04-22 21:11:39.710 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:39.713 DAGScheduler: INFO: Submitting ResultStage 3 (MapPartitionsRDD[18] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:39.817 MemoryStore: INFO: Block broadcast_26 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:39.833 MemoryStore: INFO: Block broadcast_26_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:39.834 BlockManagerInfo: INFO: Added broadcast_26_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:39.836 SparkContext: INFO: Created broadcast 26 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:39.839 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 3 (MapPartitionsRDD[18] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:39.839 TaskSchedulerImpl: INFO: Adding task set 3.0 with 2 tasks resource profile 0 2023-04-22 21:11:39.860 TaskSetManager: INFO: Starting task 0.0 in stage 3.0 (TID 24) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:39.862 Executor: INFO: Running task 0.0 in stage 3.0 (TID 24) 2023-04-22 21:11:40.002 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:40.004 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 43 ms 2023-04-22 21:11:40.092 Executor: INFO: Finished task 0.0 in stage 3.0 (TID 24). 34646 bytes result sent to driver 2023-04-22 21:11:40.092 TaskSetManager: INFO: Starting task 1.0 in stage 3.0 (TID 25) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:40.095 TaskSetManager: INFO: Finished task 0.0 in stage 3.0 (TID 24) in 237 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:40.095 Executor: INFO: Running task 1.0 in stage 3.0 (TID 25) 2023-04-22 21:11:40.143 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:40.143 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:40.169 Executor: INFO: Finished task 1.0 in stage 3.0 (TID 25). 
34646 bytes result sent to driver 2023-04-22 21:11:40.170 TaskSetManager: INFO: Finished task 1.0 in stage 3.0 (TID 25) in 78 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:40.171 TaskSchedulerImpl: INFO: Removed TaskSet 3.0, whose tasks have all completed, from pool 2023-04-22 21:11:40.171 DAGScheduler: INFO: ResultStage 3 (treeAggregate at RowMatrix.scala:94) finished in 0.439 s 2023-04-22 21:11:40.171 DAGScheduler: INFO: Job 2 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:40.171 TaskSchedulerImpl: INFO: Killing all running tasks in stage 3: Stage finished 2023-04-22 21:11:40.173 DAGScheduler: INFO: Job 2 finished: treeAggregate at RowMatrix.scala:94, took 61.048248 s 2023-04-22 21:11:40.178 MemoryStore: INFO: Block broadcast_27 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:40.181 MemoryStore: INFO: Block broadcast_27_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:40.183 BlockManagerInfo: INFO: Added broadcast_27_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:40.184 SparkContext: INFO: Created broadcast 27 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:40.439 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:40.440 DAGScheduler: INFO: Registering RDD 20 (treeAggregate at RowMatrix.scala:94) as input to shuffle 1 2023-04-22 21:11:40.440 DAGScheduler: INFO: Got job 3 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:40.440 DAGScheduler: INFO: Final stage: ResultStage 5 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:40.440 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 4) 2023-04-22 21:11:40.440 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 4) 2023-04-22 21:11:40.445 DAGScheduler: INFO: Submitting ShuffleMapStage 4 (MapPartitionsRDD[20] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:41.354 MemoryStore: INFO: Block broadcast_28 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:41.360 MemoryStore: INFO: Block broadcast_28_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:41.361 BlockManagerInfo: INFO: Added broadcast_28_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:41.362 SparkContext: INFO: Created broadcast 28 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:41.362 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 4 (MapPartitionsRDD[20] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:41.362 TaskSchedulerImpl: INFO: Adding task set 4.0 with 8 tasks resource profile 0 2023-04-22 21:11:41.374 TaskSetManager: INFO: Starting task 0.0 in stage 4.0 (TID 26) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:41.374 Executor: INFO: Running task 0.0 in stage 4.0 (TID 26) 2023-04-22 21:11:41.409 BlockManagerInfo: INFO: Removed broadcast_26_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:41.453 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:41.632 Executor: INFO: Finished task 0.0 in stage 4.0 (TID 26). 
1196 bytes result sent to driver 2023-04-22 21:11:41.636 TaskSetManager: INFO: Starting task 1.0 in stage 4.0 (TID 27) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:41.636 TaskSetManager: INFO: Finished task 0.0 in stage 4.0 (TID 26) in 263 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:41.643 Executor: INFO: Running task 1.0 in stage 4.0 (TID 27) 2023-04-22 21:11:41.684 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:41.820 Executor: INFO: Finished task 1.0 in stage 4.0 (TID 27). 1196 bytes result sent to driver 2023-04-22 21:11:41.824 TaskSetManager: INFO: Starting task 2.0 in stage 4.0 (TID 28) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:41.824 TaskSetManager: INFO: Finished task 1.0 in stage 4.0 (TID 27) in 188 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:41.826 Executor: INFO: Running task 2.0 in stage 4.0 (TID 28) 2023-04-22 21:11:41.855 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:42.034 Executor: INFO: Finished task 2.0 in stage 4.0 (TID 28). 1196 bytes result sent to driver 2023-04-22 21:11:42.037 TaskSetManager: INFO: Starting task 3.0 in stage 4.0 (TID 29) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:42.038 TaskSetManager: INFO: Finished task 2.0 in stage 4.0 (TID 28) in 214 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:42.039 Executor: INFO: Running task 3.0 in stage 4.0 (TID 29) 2023-04-22 21:11:42.086 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:42.217 Executor: INFO: Finished task 3.0 in stage 4.0 (TID 29). 1196 bytes result sent to driver 2023-04-22 21:11:42.218 TaskSetManager: INFO: Starting task 4.0 in stage 4.0 (TID 30) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:42.220 TaskSetManager: INFO: Finished task 3.0 in stage 4.0 (TID 29) in 183 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:42.222 Executor: INFO: Running task 4.0 in stage 4.0 (TID 30) 2023-04-22 21:11:42.253 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:42.390 Executor: INFO: Finished task 4.0 in stage 4.0 (TID 30). 1196 bytes result sent to driver 2023-04-22 21:11:42.393 TaskSetManager: INFO: Starting task 5.0 in stage 4.0 (TID 31) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:42.394 TaskSetManager: INFO: Finished task 4.0 in stage 4.0 (TID 30) in 176 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:42.395 Executor: INFO: Running task 5.0 in stage 4.0 (TID 31) 2023-04-22 21:11:42.425 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:42.604 Executor: INFO: Finished task 5.0 in stage 4.0 (TID 31). 
1196 bytes result sent to driver 2023-04-22 21:11:42.604 TaskSetManager: INFO: Starting task 6.0 in stage 4.0 (TID 32) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:42.605 TaskSetManager: INFO: Finished task 5.0 in stage 4.0 (TID 31) in 212 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:42.605 Executor: INFO: Running task 6.0 in stage 4.0 (TID 32) 2023-04-22 21:11:42.641 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:42.771 Executor: INFO: Finished task 6.0 in stage 4.0 (TID 32). 1196 bytes result sent to driver 2023-04-22 21:11:42.771 TaskSetManager: INFO: Starting task 7.0 in stage 4.0 (TID 33) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:42.772 TaskSetManager: INFO: Finished task 6.0 in stage 4.0 (TID 32) in 168 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:42.781 Executor: INFO: Running task 7.0 in stage 4.0 (TID 33) 2023-04-22 21:11:42.818 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:42.963 Executor: INFO: Finished task 7.0 in stage 4.0 (TID 33). 1196 bytes result sent to driver 2023-04-22 21:11:42.965 TaskSetManager: INFO: Finished task 7.0 in stage 4.0 (TID 33) in 194 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:42.965 TaskSchedulerImpl: INFO: Removed TaskSet 4.0, whose tasks have all completed, from pool 2023-04-22 21:11:42.965 DAGScheduler: INFO: ShuffleMapStage 4 (treeAggregate at RowMatrix.scala:94) finished in 2.519 s 2023-04-22 21:11:42.965 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:42.965 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:42.965 DAGScheduler: INFO: waiting: Set(ResultStage 5) 2023-04-22 21:11:42.965 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:42.966 DAGScheduler: INFO: Submitting ResultStage 5 (MapPartitionsRDD[22] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:43.035 MemoryStore: INFO: Block broadcast_29 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:43.041 MemoryStore: INFO: Block broadcast_29_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:43.042 BlockManagerInfo: INFO: Added broadcast_29_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:43.042 SparkContext: INFO: Created broadcast 29 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:43.043 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 5 (MapPartitionsRDD[22] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:43.043 TaskSchedulerImpl: INFO: Adding task set 5.0 with 2 tasks resource profile 0 2023-04-22 21:11:43.053 TaskSetManager: INFO: Starting task 0.0 in stage 5.0 (TID 34) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:43.053 Executor: INFO: Running task 0.0 in stage 5.0 (TID 34) 2023-04-22 21:11:43.088 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:43.088 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 6 ms 2023-04-22 21:11:43.095 Executor: INFO: 
Finished task 0.0 in stage 5.0 (TID 34). 34646 bytes result sent to driver 2023-04-22 21:11:43.111 TaskSetManager: INFO: Starting task 1.0 in stage 5.0 (TID 35) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:43.111 TaskSetManager: INFO: Finished task 0.0 in stage 5.0 (TID 34) in 58 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:43.113 Executor: INFO: Running task 1.0 in stage 5.0 (TID 35) 2023-04-22 21:11:43.172 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:43.172 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:43.188 Executor: INFO: Finished task 1.0 in stage 5.0 (TID 35). 34646 bytes result sent to driver 2023-04-22 21:11:43.190 TaskSetManager: INFO: Finished task 1.0 in stage 5.0 (TID 35) in 79 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:43.190 TaskSchedulerImpl: INFO: Removed TaskSet 5.0, whose tasks have all completed, from pool 2023-04-22 21:11:43.190 DAGScheduler: INFO: ResultStage 5 (treeAggregate at RowMatrix.scala:94) finished in 0.224 s 2023-04-22 21:11:43.190 DAGScheduler: INFO: Job 3 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:43.190 TaskSchedulerImpl: INFO: Killing all running tasks in stage 5: Stage finished 2023-04-22 21:11:43.191 DAGScheduler: INFO: Job 3 finished: treeAggregate at RowMatrix.scala:94, took 2.751889 s 2023-04-22 21:11:43.196 MemoryStore: INFO: Block broadcast_30 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:43.198 MemoryStore: INFO: Block broadcast_30_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:43.198 BlockManagerInfo: INFO: Added broadcast_30_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:43.202 SparkContext: INFO: Created broadcast 30 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:43.317 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:43.330 DAGScheduler: INFO: Registering RDD 24 (treeAggregate at RowMatrix.scala:94) as input to shuffle 2 2023-04-22 21:11:43.331 DAGScheduler: INFO: Got job 4 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:43.331 DAGScheduler: INFO: Final stage: ResultStage 7 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:43.331 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 6) 2023-04-22 21:11:43.331 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 6) 2023-04-22 21:11:43.334 DAGScheduler: INFO: Submitting ShuffleMapStage 6 (MapPartitionsRDD[24] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:43.416 MemoryStore: INFO: Block broadcast_31 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:43.422 MemoryStore: INFO: Block broadcast_31_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:43.423 BlockManagerInfo: INFO: Added broadcast_31_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:43.424 SparkContext: INFO: Created broadcast 31 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:43.424 DAGScheduler: INFO: Submitting 8 missing tasks 
from ShuffleMapStage 6 (MapPartitionsRDD[24] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:43.424 TaskSchedulerImpl: INFO: Adding task set 6.0 with 8 tasks resource profile 0 2023-04-22 21:11:43.425 TaskSetManager: INFO: Starting task 0.0 in stage 6.0 (TID 36) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:43.425 Executor: INFO: Running task 0.0 in stage 6.0 (TID 36) 2023-04-22 21:11:43.464 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:43.662 Executor: INFO: Finished task 0.0 in stage 6.0 (TID 36). 1196 bytes result sent to driver 2023-04-22 21:11:43.668 TaskSetManager: INFO: Starting task 1.0 in stage 6.0 (TID 37) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:43.668 TaskSetManager: INFO: Finished task 0.0 in stage 6.0 (TID 36) in 243 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:43.670 Executor: INFO: Running task 1.0 in stage 6.0 (TID 37) 2023-04-22 21:11:43.698 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:43.845 Executor: INFO: Finished task 1.0 in stage 6.0 (TID 37). 1196 bytes result sent to driver 2023-04-22 21:11:43.848 TaskSetManager: INFO: Starting task 2.0 in stage 6.0 (TID 38) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:43.858 TaskSetManager: INFO: Finished task 1.0 in stage 6.0 (TID 37) in 191 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:43.868 Executor: INFO: Running task 2.0 in stage 6.0 (TID 38) 2023-04-22 21:11:43.900 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:44.032 Executor: INFO: Finished task 2.0 in stage 6.0 (TID 38). 1196 bytes result sent to driver 2023-04-22 21:11:44.040 TaskSetManager: INFO: Starting task 3.0 in stage 6.0 (TID 39) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:44.040 TaskSetManager: INFO: Finished task 2.0 in stage 6.0 (TID 38) in 193 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:44.042 Executor: INFO: Running task 3.0 in stage 6.0 (TID 39) 2023-04-22 21:11:44.071 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:44.216 Executor: INFO: Finished task 3.0 in stage 6.0 (TID 39). 1196 bytes result sent to driver 2023-04-22 21:11:44.217 TaskSetManager: INFO: Starting task 4.0 in stage 6.0 (TID 40) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:44.217 TaskSetManager: INFO: Finished task 3.0 in stage 6.0 (TID 39) in 177 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:44.221 Executor: INFO: Running task 4.0 in stage 6.0 (TID 40) 2023-04-22 21:11:44.247 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:44.377 Executor: INFO: Finished task 4.0 in stage 6.0 (TID 40). 
1196 bytes result sent to driver 2023-04-22 21:11:44.389 TaskSetManager: INFO: Starting task 5.0 in stage 6.0 (TID 41) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:44.389 Executor: INFO: Running task 5.0 in stage 6.0 (TID 41) 2023-04-22 21:11:44.390 TaskSetManager: INFO: Finished task 4.0 in stage 6.0 (TID 40) in 173 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:44.444 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:44.592 Executor: INFO: Finished task 5.0 in stage 6.0 (TID 41). 1196 bytes result sent to driver 2023-04-22 21:11:44.592 TaskSetManager: INFO: Starting task 6.0 in stage 6.0 (TID 42) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:44.593 TaskSetManager: INFO: Finished task 5.0 in stage 6.0 (TID 41) in 204 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:44.593 Executor: INFO: Running task 6.0 in stage 6.0 (TID 42) 2023-04-22 21:11:44.645 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:44.841 Executor: INFO: Finished task 6.0 in stage 6.0 (TID 42). 1196 bytes result sent to driver 2023-04-22 21:11:44.849 TaskSetManager: INFO: Starting task 7.0 in stage 6.0 (TID 43) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:44.850 TaskSetManager: INFO: Finished task 6.0 in stage 6.0 (TID 42) in 258 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:44.851 Executor: INFO: Running task 7.0 in stage 6.0 (TID 43) 2023-04-22 21:11:44.879 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:45.003 Executor: INFO: Finished task 7.0 in stage 6.0 (TID 43). 
1196 bytes result sent to driver 2023-04-22 21:11:45.010 TaskSetManager: INFO: Finished task 7.0 in stage 6.0 (TID 43) in 161 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:45.010 TaskSchedulerImpl: INFO: Removed TaskSet 6.0, whose tasks have all completed, from pool 2023-04-22 21:11:45.011 DAGScheduler: INFO: ShuffleMapStage 6 (treeAggregate at RowMatrix.scala:94) finished in 1.676 s 2023-04-22 21:11:45.011 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:45.011 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:45.011 DAGScheduler: INFO: waiting: Set(ResultStage 7) 2023-04-22 21:11:45.011 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:45.011 DAGScheduler: INFO: Submitting ResultStage 7 (MapPartitionsRDD[26] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:45.056 MemoryStore: INFO: Block broadcast_32 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:45.062 MemoryStore: INFO: Block broadcast_32_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:45.073 BlockManagerInfo: INFO: Added broadcast_32_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:45.073 SparkContext: INFO: Created broadcast 32 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:45.074 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 7 (MapPartitionsRDD[26] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:45.074 TaskSchedulerImpl: INFO: Adding task set 7.0 with 2 tasks resource profile 0 2023-04-22 21:11:45.076 TaskSetManager: INFO: Starting task 0.0 in stage 7.0 (TID 44) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:45.076 Executor: INFO: Running task 0.0 in stage 7.0 (TID 44) 2023-04-22 21:11:45.110 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:45.110 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 3 ms 2023-04-22 21:11:45.118 Executor: INFO: Finished task 0.0 in stage 7.0 (TID 44). 34646 bytes result sent to driver 2023-04-22 21:11:45.125 TaskSetManager: INFO: Starting task 1.0 in stage 7.0 (TID 45) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:45.126 TaskSetManager: INFO: Finished task 0.0 in stage 7.0 (TID 44) in 50 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:45.128 Executor: INFO: Running task 1.0 in stage 7.0 (TID 45) 2023-04-22 21:11:45.155 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:45.155 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:45.198 Executor: INFO: Finished task 1.0 in stage 7.0 (TID 45). 
34646 bytes result sent to driver 2023-04-22 21:11:45.200 TaskSetManager: INFO: Finished task 1.0 in stage 7.0 (TID 45) in 75 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:45.200 TaskSchedulerImpl: INFO: Removed TaskSet 7.0, whose tasks have all completed, from pool 2023-04-22 21:11:45.200 DAGScheduler: INFO: ResultStage 7 (treeAggregate at RowMatrix.scala:94) finished in 0.188 s 2023-04-22 21:11:45.200 DAGScheduler: INFO: Job 4 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:45.200 TaskSchedulerImpl: INFO: Killing all running tasks in stage 7: Stage finished 2023-04-22 21:11:45.201 DAGScheduler: INFO: Job 4 finished: treeAggregate at RowMatrix.scala:94, took 1.884266 s 2023-04-22 21:11:45.207 MemoryStore: INFO: Block broadcast_33 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:45.208 MemoryStore: INFO: Block broadcast_33_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:45.209 BlockManagerInfo: INFO: Added broadcast_33_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:45.211 SparkContext: INFO: Created broadcast 33 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:45.329 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:45.330 DAGScheduler: INFO: Registering RDD 28 (treeAggregate at RowMatrix.scala:94) as input to shuffle 3 2023-04-22 21:11:45.331 DAGScheduler: INFO: Got job 5 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:45.331 DAGScheduler: INFO: Final stage: ResultStage 9 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:45.331 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 8) 2023-04-22 21:11:45.331 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 8) 2023-04-22 21:11:45.340 DAGScheduler: INFO: Submitting ShuffleMapStage 8 (MapPartitionsRDD[28] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:45.416 MemoryStore: INFO: Block broadcast_34 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:45.425 MemoryStore: INFO: Block broadcast_34_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:45.426 BlockManagerInfo: INFO: Added broadcast_34_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:45.441 SparkContext: INFO: Created broadcast 34 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:45.441 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 8 (MapPartitionsRDD[28] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:45.441 TaskSchedulerImpl: INFO: Adding task set 8.0 with 8 tasks resource profile 0 2023-04-22 21:11:45.442 TaskSetManager: INFO: Starting task 0.0 in stage 8.0 (TID 46) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:45.444 Executor: INFO: Running task 0.0 in stage 8.0 (TID 46) 2023-04-22 21:11:45.472 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:45.731 Executor: INFO: Finished task 0.0 in stage 8.0 (TID 46). 
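
The pairing that repeats throughout this section, a broadcast created at RowMatrix.scala:93 immediately followed by a treeAggregate job at RowMatrix.scala:94, is consistent with Spark MLlib's RowMatrix multiplying its Gram matrix A^T A by a broadcast vector without ever materializing A^T A, the per-iteration step used by the iterative eigensolver behind RowMatrix.computeSVD. Read that way, each of Jobs 4, 5, 6, ... is one matrix-vector multiply against the cached row blocks. The snippet below is only a NumPy illustration of that arithmetic under this assumed interpretation, not code from Hail or Spark.

```python
import numpy as np

# Illustration only (assumed interpretation of the repeated jobs): each
# treeAggregate can compute A^T (A v) by summing per-row contributions
# a_i * (a_i . v), so the n x n Gram matrix A^T A is never built.
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 50))   # in Spark, the rows live in 8 cached partitions
v = rng.normal(size=50)           # the vector broadcast before each job

partials = (a * (a @ v) for a in A)   # what each row contributes
gram_times_v = sum(partials)          # what the map and result stages add up

assert np.allclose(gram_times_v, A.T @ (A @ v))
```
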
1196 bytes result sent to driver 2023-04-22 21:11:45.733 TaskSetManager: INFO: Starting task 1.0 in stage 8.0 (TID 47) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:45.733 TaskSetManager: INFO: Finished task 0.0 in stage 8.0 (TID 46) in 291 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:45.736 Executor: INFO: Running task 1.0 in stage 8.0 (TID 47) 2023-04-22 21:11:45.788 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:46.053 Executor: INFO: Finished task 1.0 in stage 8.0 (TID 47). 1196 bytes result sent to driver 2023-04-22 21:11:46.054 TaskSetManager: INFO: Starting task 2.0 in stage 8.0 (TID 48) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:46.055 TaskSetManager: INFO: Finished task 1.0 in stage 8.0 (TID 47) in 323 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:46.056 Executor: INFO: Running task 2.0 in stage 8.0 (TID 48) 2023-04-22 21:11:46.096 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:46.371 Executor: INFO: Finished task 2.0 in stage 8.0 (TID 48). 1196 bytes result sent to driver 2023-04-22 21:11:46.372 TaskSetManager: INFO: Starting task 3.0 in stage 8.0 (TID 49) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:46.372 TaskSetManager: INFO: Finished task 2.0 in stage 8.0 (TID 48) in 318 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:46.376 Executor: INFO: Running task 3.0 in stage 8.0 (TID 49) 2023-04-22 21:11:46.416 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:46.673 Executor: INFO: Finished task 3.0 in stage 8.0 (TID 49). 1196 bytes result sent to driver 2023-04-22 21:11:46.676 TaskSetManager: INFO: Starting task 4.0 in stage 8.0 (TID 50) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:46.677 TaskSetManager: INFO: Finished task 3.0 in stage 8.0 (TID 49) in 305 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:46.678 Executor: INFO: Running task 4.0 in stage 8.0 (TID 50) 2023-04-22 21:11:46.741 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:46.963 Executor: INFO: Finished task 4.0 in stage 8.0 (TID 50). 1196 bytes result sent to driver 2023-04-22 21:11:46.964 TaskSetManager: INFO: Starting task 5.0 in stage 8.0 (TID 51) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:46.964 TaskSetManager: INFO: Finished task 4.0 in stage 8.0 (TID 50) in 288 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:46.981 Executor: INFO: Running task 5.0 in stage 8.0 (TID 51) 2023-04-22 21:11:47.016 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:47.147 Executor: INFO: Finished task 5.0 in stage 8.0 (TID 51). 
1196 bytes result sent to driver 2023-04-22 21:11:47.148 TaskSetManager: INFO: Starting task 6.0 in stage 8.0 (TID 52) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:47.149 TaskSetManager: INFO: Finished task 5.0 in stage 8.0 (TID 51) in 185 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:47.152 Executor: INFO: Running task 6.0 in stage 8.0 (TID 52) 2023-04-22 21:11:47.179 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:47.301 Executor: INFO: Finished task 6.0 in stage 8.0 (TID 52). 1196 bytes result sent to driver 2023-04-22 21:11:47.310 TaskSetManager: INFO: Starting task 7.0 in stage 8.0 (TID 53) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:47.311 TaskSetManager: INFO: Finished task 6.0 in stage 8.0 (TID 52) in 163 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:47.325 Executor: INFO: Running task 7.0 in stage 8.0 (TID 53) 2023-04-22 21:11:47.378 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:47.511 Executor: INFO: Finished task 7.0 in stage 8.0 (TID 53). 1196 bytes result sent to driver 2023-04-22 21:11:47.512 TaskSetManager: INFO: Finished task 7.0 in stage 8.0 (TID 53) in 202 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:47.512 TaskSchedulerImpl: INFO: Removed TaskSet 8.0, whose tasks have all completed, from pool 2023-04-22 21:11:47.513 DAGScheduler: INFO: ShuffleMapStage 8 (treeAggregate at RowMatrix.scala:94) finished in 2.172 s 2023-04-22 21:11:47.513 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:47.513 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:47.513 DAGScheduler: INFO: waiting: Set(ResultStage 9) 2023-04-22 21:11:47.513 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:47.513 DAGScheduler: INFO: Submitting ResultStage 9 (MapPartitionsRDD[30] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:47.559 MemoryStore: INFO: Block broadcast_35 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:47.565 MemoryStore: INFO: Block broadcast_35_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:47.566 BlockManagerInfo: INFO: Added broadcast_35_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:47.566 SparkContext: INFO: Created broadcast 35 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:47.567 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 9 (MapPartitionsRDD[30] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:47.567 TaskSchedulerImpl: INFO: Adding task set 9.0 with 2 tasks resource profile 0 2023-04-22 21:11:47.568 TaskSetManager: INFO: Starting task 0.0 in stage 9.0 (TID 54) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:47.569 Executor: INFO: Running task 0.0 in stage 9.0 (TID 54) 2023-04-22 21:11:47.616 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:47.616 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:47.636 Executor: INFO: 
Finished task 0.0 in stage 9.0 (TID 54). 34646 bytes result sent to driver 2023-04-22 21:11:47.638 TaskSetManager: INFO: Starting task 1.0 in stage 9.0 (TID 55) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:47.638 TaskSetManager: INFO: Finished task 0.0 in stage 9.0 (TID 54) in 71 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:47.647 Executor: INFO: Running task 1.0 in stage 9.0 (TID 55) 2023-04-22 21:11:47.674 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:47.674 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:47.679 Executor: INFO: Finished task 1.0 in stage 9.0 (TID 55). 34646 bytes result sent to driver 2023-04-22 21:11:47.682 TaskSetManager: INFO: Finished task 1.0 in stage 9.0 (TID 55) in 44 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:47.682 TaskSchedulerImpl: INFO: Removed TaskSet 9.0, whose tasks have all completed, from pool 2023-04-22 21:11:47.683 DAGScheduler: INFO: ResultStage 9 (treeAggregate at RowMatrix.scala:94) finished in 0.169 s 2023-04-22 21:11:47.683 DAGScheduler: INFO: Job 5 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:47.683 TaskSchedulerImpl: INFO: Killing all running tasks in stage 9: Stage finished 2023-04-22 21:11:47.684 DAGScheduler: INFO: Job 5 finished: treeAggregate at RowMatrix.scala:94, took 2.354689 s 2023-04-22 21:11:47.690 MemoryStore: INFO: Block broadcast_36 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:47.699 MemoryStore: INFO: Block broadcast_36_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:47.700 BlockManagerInfo: INFO: Added broadcast_36_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:47.701 SparkContext: INFO: Created broadcast 36 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:47.838 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:47.839 DAGScheduler: INFO: Registering RDD 32 (treeAggregate at RowMatrix.scala:94) as input to shuffle 4 2023-04-22 21:11:47.839 DAGScheduler: INFO: Got job 6 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:47.839 DAGScheduler: INFO: Final stage: ResultStage 11 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:47.839 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 10) 2023-04-22 21:11:47.839 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 10) 2023-04-22 21:11:47.843 DAGScheduler: INFO: Submitting ShuffleMapStage 10 (MapPartitionsRDD[32] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:47.884 MemoryStore: INFO: Block broadcast_37 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:47.890 MemoryStore: INFO: Block broadcast_37_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:47.892 BlockManagerInfo: INFO: Added broadcast_37_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:47.893 SparkContext: INFO: Created broadcast 37 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:47.893 DAGScheduler: INFO: Submitting 8 missing 
tasks from ShuffleMapStage 10 (MapPartitionsRDD[32] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:47.893 TaskSchedulerImpl: INFO: Adding task set 10.0 with 8 tasks resource profile 0 2023-04-22 21:11:47.895 TaskSetManager: INFO: Starting task 0.0 in stage 10.0 (TID 56) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:47.896 Executor: INFO: Running task 0.0 in stage 10.0 (TID 56) 2023-04-22 21:11:47.925 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:48.056 Executor: INFO: Finished task 0.0 in stage 10.0 (TID 56). 1196 bytes result sent to driver 2023-04-22 21:11:48.070 TaskSetManager: INFO: Starting task 1.0 in stage 10.0 (TID 57) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.071 TaskSetManager: INFO: Finished task 0.0 in stage 10.0 (TID 56) in 177 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:48.081 Executor: INFO: Running task 1.0 in stage 10.0 (TID 57) 2023-04-22 21:11:48.108 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:48.236 Executor: INFO: Finished task 1.0 in stage 10.0 (TID 57). 1196 bytes result sent to driver 2023-04-22 21:11:48.236 TaskSetManager: INFO: Starting task 2.0 in stage 10.0 (TID 58) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.237 TaskSetManager: INFO: Finished task 1.0 in stage 10.0 (TID 57) in 167 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:48.239 Executor: INFO: Running task 2.0 in stage 10.0 (TID 58) 2023-04-22 21:11:48.266 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:48.397 Executor: INFO: Finished task 2.0 in stage 10.0 (TID 58). 1196 bytes result sent to driver 2023-04-22 21:11:48.399 TaskSetManager: INFO: Starting task 3.0 in stage 10.0 (TID 59) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.409 TaskSetManager: INFO: Finished task 2.0 in stage 10.0 (TID 58) in 173 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:48.415 Executor: INFO: Running task 3.0 in stage 10.0 (TID 59) 2023-04-22 21:11:48.443 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:48.566 Executor: INFO: Finished task 3.0 in stage 10.0 (TID 59). 1196 bytes result sent to driver 2023-04-22 21:11:48.566 TaskSetManager: INFO: Starting task 4.0 in stage 10.0 (TID 60) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.570 TaskSetManager: INFO: Finished task 3.0 in stage 10.0 (TID 59) in 171 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:48.577 Executor: INFO: Running task 4.0 in stage 10.0 (TID 60) 2023-04-22 21:11:48.608 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:48.733 Executor: INFO: Finished task 4.0 in stage 10.0 (TID 60). 
1196 bytes result sent to driver 2023-04-22 21:11:48.736 TaskSetManager: INFO: Starting task 5.0 in stage 10.0 (TID 61) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.736 TaskSetManager: INFO: Finished task 4.0 in stage 10.0 (TID 60) in 170 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:48.740 Executor: INFO: Running task 5.0 in stage 10.0 (TID 61) 2023-04-22 21:11:48.767 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:48.893 Executor: INFO: Finished task 5.0 in stage 10.0 (TID 61). 1196 bytes result sent to driver 2023-04-22 21:11:48.896 TaskSetManager: INFO: Starting task 6.0 in stage 10.0 (TID 62) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:48.907 TaskSetManager: INFO: Finished task 5.0 in stage 10.0 (TID 61) in 171 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:48.921 Executor: INFO: Running task 6.0 in stage 10.0 (TID 62) 2023-04-22 21:11:48.951 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:49.074 Executor: INFO: Finished task 6.0 in stage 10.0 (TID 62). 1196 bytes result sent to driver 2023-04-22 21:11:49.076 TaskSetManager: INFO: Starting task 7.0 in stage 10.0 (TID 63) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.076 TaskSetManager: INFO: Finished task 6.0 in stage 10.0 (TID 62) in 180 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:49.078 Executor: INFO: Running task 7.0 in stage 10.0 (TID 63) 2023-04-22 21:11:49.105 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:49.233 Executor: INFO: Finished task 7.0 in stage 10.0 (TID 63). 
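
Every task in these shuffle-map stages logs "Found block rdd_12_N locally": the row data was materialized and cached the first time it was needed, so each later job reads its partition straight out of the local BlockManager (hence the PROCESS_LOCAL tasks and the short, uniform task times) instead of recomputing it. The fragment below is a generic PySpark illustration of that caching pattern, not the actual Hail code.

```python
from pyspark.sql import SparkSession

# Generic sketch (not Hail code): cache an RDD once, then run several actions
# over it. The first action computes and stores the partitions; later jobs log
# "Found block rdd_N_M locally" instead of recomputing them.
spark = SparkSession.builder.master("local[*]").getOrCreate()
sc = spark.sparkContext

rows = sc.range(0, 1_000_000, numSlices=8).cache()

first = rows.sum()    # computes the 8 partitions and stores their blocks
second = rows.sum()   # re-reads the cached blocks, no recomputation
```
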
1196 bytes result sent to driver 2023-04-22 21:11:49.240 TaskSetManager: INFO: Finished task 7.0 in stage 10.0 (TID 63) in 164 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:49.240 TaskSchedulerImpl: INFO: Removed TaskSet 10.0, whose tasks have all completed, from pool 2023-04-22 21:11:49.240 DAGScheduler: INFO: ShuffleMapStage 10 (treeAggregate at RowMatrix.scala:94) finished in 1.396 s 2023-04-22 21:11:49.240 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:49.240 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:49.240 DAGScheduler: INFO: waiting: Set(ResultStage 11) 2023-04-22 21:11:49.240 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:49.241 DAGScheduler: INFO: Submitting ResultStage 11 (MapPartitionsRDD[34] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:49.308 MemoryStore: INFO: Block broadcast_38 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:49.314 MemoryStore: INFO: Block broadcast_38_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:49.315 BlockManagerInfo: INFO: Added broadcast_38_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:49.315 SparkContext: INFO: Created broadcast 38 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:49.316 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 11 (MapPartitionsRDD[34] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:49.316 TaskSchedulerImpl: INFO: Adding task set 11.0 with 2 tasks resource profile 0 2023-04-22 21:11:49.317 TaskSetManager: INFO: Starting task 0.0 in stage 11.0 (TID 64) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.319 Executor: INFO: Running task 0.0 in stage 11.0 (TID 64) 2023-04-22 21:11:49.347 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:49.347 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:49.362 Executor: INFO: Finished task 0.0 in stage 11.0 (TID 64). 34646 bytes result sent to driver 2023-04-22 21:11:49.362 TaskSetManager: INFO: Starting task 1.0 in stage 11.0 (TID 65) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.362 TaskSetManager: INFO: Finished task 0.0 in stage 11.0 (TID 64) in 46 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:49.366 Executor: INFO: Running task 1.0 in stage 11.0 (TID 65) 2023-04-22 21:11:49.406 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:49.406 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:49.414 Executor: INFO: Finished task 1.0 in stage 11.0 (TID 65). 
34646 bytes result sent to driver 2023-04-22 21:11:49.415 TaskSetManager: INFO: Finished task 1.0 in stage 11.0 (TID 65) in 53 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:49.415 TaskSchedulerImpl: INFO: Removed TaskSet 11.0, whose tasks have all completed, from pool 2023-04-22 21:11:49.415 DAGScheduler: INFO: ResultStage 11 (treeAggregate at RowMatrix.scala:94) finished in 0.174 s 2023-04-22 21:11:49.415 DAGScheduler: INFO: Job 6 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:49.415 TaskSchedulerImpl: INFO: Killing all running tasks in stage 11: Stage finished 2023-04-22 21:11:49.416 DAGScheduler: INFO: Job 6 finished: treeAggregate at RowMatrix.scala:94, took 1.577898 s 2023-04-22 21:11:49.421 MemoryStore: INFO: Block broadcast_39 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:49.423 MemoryStore: INFO: Block broadcast_39_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:49.428 BlockManagerInfo: INFO: Added broadcast_39_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:49.429 SparkContext: INFO: Created broadcast 39 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:49.509 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:49.511 DAGScheduler: INFO: Registering RDD 36 (treeAggregate at RowMatrix.scala:94) as input to shuffle 5 2023-04-22 21:11:49.511 DAGScheduler: INFO: Got job 7 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:49.511 DAGScheduler: INFO: Final stage: ResultStage 13 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:49.511 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 12) 2023-04-22 21:11:49.511 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 12) 2023-04-22 21:11:49.524 DAGScheduler: INFO: Submitting ShuffleMapStage 12 (MapPartitionsRDD[36] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:49.565 MemoryStore: INFO: Block broadcast_40 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:49.571 MemoryStore: INFO: Block broadcast_40_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:49.575 BlockManagerInfo: INFO: Added broadcast_40_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:49.575 SparkContext: INFO: Created broadcast 40 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:49.576 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 12 (MapPartitionsRDD[36] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:49.576 TaskSchedulerImpl: INFO: Adding task set 12.0 with 8 tasks resource profile 0 2023-04-22 21:11:49.577 TaskSetManager: INFO: Starting task 0.0 in stage 12.0 (TID 66) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.579 Executor: INFO: Running task 0.0 in stage 12.0 (TID 66) 2023-04-22 21:11:49.606 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:49.735 Executor: INFO: Finished task 0.0 in stage 12.0 (TID 66). 
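
The same broadcast-then-treeAggregate pattern repeats as Jobs 4 through 9 in this excerpt, each finishing in roughly 1.6 to 2.4 seconds, so the number of iterations, rather than any single stage, is what drives the wall-clock time of this phase. A throwaway helper along the lines below (hypothetical, not part of Hail or Spark) can tally those records when skimming a long log.

```python
import re

# Hypothetical helper: count the "Job N finished: treeAggregate ... took X s"
# records in a Hail/Spark driver log and report their total wall-clock time.
JOB_RE = re.compile(r"Job (\d+) finished: treeAggregate.*?took ([\d.]+) s")

def summarize_tree_aggregate_jobs(log_path):
    jobs = []
    with open(log_path) as fh:
        for line in fh:
            for job_id, seconds in JOB_RE.findall(line):
                jobs.append((int(job_id), float(seconds)))
    total = sum(seconds for _, seconds in jobs)
    print(f"{len(jobs)} treeAggregate jobs, {total:.1f} s total")
    return jobs

# e.g. summarize_tree_aggregate_jobs("hail.log")
```
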
1197 bytes result sent to driver 2023-04-22 21:11:49.737 TaskSetManager: INFO: Starting task 1.0 in stage 12.0 (TID 67) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.737 TaskSetManager: INFO: Finished task 0.0 in stage 12.0 (TID 66) in 161 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:49.740 Executor: INFO: Running task 1.0 in stage 12.0 (TID 67) 2023-04-22 21:11:49.767 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:49.896 Executor: INFO: Finished task 1.0 in stage 12.0 (TID 67). 1197 bytes result sent to driver 2023-04-22 21:11:49.897 TaskSetManager: INFO: Starting task 2.0 in stage 12.0 (TID 68) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:49.900 TaskSetManager: INFO: Finished task 1.0 in stage 12.0 (TID 67) in 163 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:49.902 Executor: INFO: Running task 2.0 in stage 12.0 (TID 68) 2023-04-22 21:11:49.934 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:50.064 Executor: INFO: Finished task 2.0 in stage 12.0 (TID 68). 1197 bytes result sent to driver 2023-04-22 21:11:50.065 TaskSetManager: INFO: Starting task 3.0 in stage 12.0 (TID 69) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:50.066 TaskSetManager: INFO: Finished task 2.0 in stage 12.0 (TID 68) in 169 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:50.067 Executor: INFO: Running task 3.0 in stage 12.0 (TID 69) 2023-04-22 21:11:50.094 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:50.223 Executor: INFO: Finished task 3.0 in stage 12.0 (TID 69). 1197 bytes result sent to driver 2023-04-22 21:11:50.224 TaskSetManager: INFO: Starting task 4.0 in stage 12.0 (TID 70) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:50.225 TaskSetManager: INFO: Finished task 3.0 in stage 12.0 (TID 69) in 160 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:50.229 Executor: INFO: Running task 4.0 in stage 12.0 (TID 70) 2023-04-22 21:11:50.257 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:50.389 Executor: INFO: Finished task 4.0 in stage 12.0 (TID 70). 1197 bytes result sent to driver 2023-04-22 21:11:50.391 TaskSetManager: INFO: Starting task 5.0 in stage 12.0 (TID 71) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:50.391 TaskSetManager: INFO: Finished task 4.0 in stage 12.0 (TID 70) in 167 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:50.392 Executor: INFO: Running task 5.0 in stage 12.0 (TID 71) 2023-04-22 21:11:50.420 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:50.594 Executor: INFO: Finished task 5.0 in stage 12.0 (TID 71). 
1197 bytes result sent to driver 2023-04-22 21:11:50.597 TaskSetManager: INFO: Starting task 6.0 in stage 12.0 (TID 72) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:50.597 TaskSetManager: INFO: Finished task 5.0 in stage 12.0 (TID 71) in 207 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:50.601 Executor: INFO: Running task 6.0 in stage 12.0 (TID 72) 2023-04-22 21:11:50.628 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:50.801 Executor: INFO: Finished task 6.0 in stage 12.0 (TID 72). 1197 bytes result sent to driver 2023-04-22 21:11:50.802 TaskSetManager: INFO: Starting task 7.0 in stage 12.0 (TID 73) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:50.802 TaskSetManager: INFO: Finished task 6.0 in stage 12.0 (TID 72) in 205 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:50.811 Executor: INFO: Running task 7.0 in stage 12.0 (TID 73) 2023-04-22 21:11:50.838 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:50.966 Executor: INFO: Finished task 7.0 in stage 12.0 (TID 73). 1197 bytes result sent to driver 2023-04-22 21:11:50.967 TaskSetManager: INFO: Finished task 7.0 in stage 12.0 (TID 73) in 165 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:50.967 TaskSchedulerImpl: INFO: Removed TaskSet 12.0, whose tasks have all completed, from pool 2023-04-22 21:11:50.968 DAGScheduler: INFO: ShuffleMapStage 12 (treeAggregate at RowMatrix.scala:94) finished in 1.444 s 2023-04-22 21:11:50.968 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:50.968 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:50.968 DAGScheduler: INFO: waiting: Set(ResultStage 13) 2023-04-22 21:11:50.968 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:50.968 DAGScheduler: INFO: Submitting ResultStage 13 (MapPartitionsRDD[38] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:51.014 MemoryStore: INFO: Block broadcast_41 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:51.020 MemoryStore: INFO: Block broadcast_41_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:51.022 BlockManagerInfo: INFO: Added broadcast_41_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:51.022 SparkContext: INFO: Created broadcast 41 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:51.023 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 13 (MapPartitionsRDD[38] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:51.023 TaskSchedulerImpl: INFO: Adding task set 13.0 with 2 tasks resource profile 0 2023-04-22 21:11:51.024 TaskSetManager: INFO: Starting task 0.0 in stage 13.0 (TID 74) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.024 Executor: INFO: Running task 0.0 in stage 13.0 (TID 74) 2023-04-22 21:11:51.051 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:51.052 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:51.057 
Executor: INFO: Finished task 0.0 in stage 13.0 (TID 74). 34646 bytes result sent to driver 2023-04-22 21:11:51.071 TaskSetManager: INFO: Starting task 1.0 in stage 13.0 (TID 75) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.072 TaskSetManager: INFO: Finished task 0.0 in stage 13.0 (TID 74) in 48 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:51.072 Executor: INFO: Running task 1.0 in stage 13.0 (TID 75) 2023-04-22 21:11:51.114 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:51.114 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:51.127 Executor: INFO: Finished task 1.0 in stage 13.0 (TID 75). 34646 bytes result sent to driver 2023-04-22 21:11:51.130 TaskSetManager: INFO: Finished task 1.0 in stage 13.0 (TID 75) in 59 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:51.130 TaskSchedulerImpl: INFO: Removed TaskSet 13.0, whose tasks have all completed, from pool 2023-04-22 21:11:51.130 DAGScheduler: INFO: ResultStage 13 (treeAggregate at RowMatrix.scala:94) finished in 0.161 s 2023-04-22 21:11:51.131 DAGScheduler: INFO: Job 7 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:51.131 TaskSchedulerImpl: INFO: Killing all running tasks in stage 13: Stage finished 2023-04-22 21:11:51.131 DAGScheduler: INFO: Job 7 finished: treeAggregate at RowMatrix.scala:94, took 1.621928 s 2023-04-22 21:11:51.135 MemoryStore: INFO: Block broadcast_42 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:51.137 MemoryStore: INFO: Block broadcast_42_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:51.137 BlockManagerInfo: INFO: Added broadcast_42_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:51.138 SparkContext: INFO: Created broadcast 42 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:51.227 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:51.229 DAGScheduler: INFO: Registering RDD 40 (treeAggregate at RowMatrix.scala:94) as input to shuffle 6 2023-04-22 21:11:51.229 DAGScheduler: INFO: Got job 8 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:51.229 DAGScheduler: INFO: Final stage: ResultStage 15 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:51.229 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 14) 2023-04-22 21:11:51.229 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 14) 2023-04-22 21:11:51.233 DAGScheduler: INFO: Submitting ShuffleMapStage 14 (MapPartitionsRDD[40] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:51.272 MemoryStore: INFO: Block broadcast_43 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:51.279 MemoryStore: INFO: Block broadcast_43_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:51.280 BlockManagerInfo: INFO: Added broadcast_43_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:51.281 SparkContext: INFO: Created broadcast 43 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:51.281 DAGScheduler: 
INFO: Submitting 8 missing tasks from ShuffleMapStage 14 (MapPartitionsRDD[40] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:51.281 TaskSchedulerImpl: INFO: Adding task set 14.0 with 8 tasks resource profile 0 2023-04-22 21:11:51.282 TaskSetManager: INFO: Starting task 0.0 in stage 14.0 (TID 76) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.282 Executor: INFO: Running task 0.0 in stage 14.0 (TID 76) 2023-04-22 21:11:51.310 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:51.437 Executor: INFO: Finished task 0.0 in stage 14.0 (TID 76). 1197 bytes result sent to driver 2023-04-22 21:11:51.438 TaskSetManager: INFO: Starting task 1.0 in stage 14.0 (TID 77) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.438 TaskSetManager: INFO: Finished task 0.0 in stage 14.0 (TID 76) in 156 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:51.439 Executor: INFO: Running task 1.0 in stage 14.0 (TID 77) 2023-04-22 21:11:51.465 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:51.592 Executor: INFO: Finished task 1.0 in stage 14.0 (TID 77). 1197 bytes result sent to driver 2023-04-22 21:11:51.594 TaskSetManager: INFO: Starting task 2.0 in stage 14.0 (TID 78) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.594 TaskSetManager: INFO: Finished task 1.0 in stage 14.0 (TID 77) in 156 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:51.596 Executor: INFO: Running task 2.0 in stage 14.0 (TID 78) 2023-04-22 21:11:51.627 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:51.756 Executor: INFO: Finished task 2.0 in stage 14.0 (TID 78). 1197 bytes result sent to driver 2023-04-22 21:11:51.756 TaskSetManager: INFO: Starting task 3.0 in stage 14.0 (TID 79) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.758 TaskSetManager: INFO: Finished task 2.0 in stage 14.0 (TID 78) in 164 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:51.758 Executor: INFO: Running task 3.0 in stage 14.0 (TID 79) 2023-04-22 21:11:51.788 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:51.916 Executor: INFO: Finished task 3.0 in stage 14.0 (TID 79). 1197 bytes result sent to driver 2023-04-22 21:11:51.918 TaskSetManager: INFO: Starting task 4.0 in stage 14.0 (TID 80) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:51.919 TaskSetManager: INFO: Finished task 3.0 in stage 14.0 (TID 79) in 163 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:51.920 Executor: INFO: Running task 4.0 in stage 14.0 (TID 80) 2023-04-22 21:11:51.951 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:52.078 Executor: INFO: Finished task 4.0 in stage 14.0 (TID 80). 
1197 bytes result sent to driver 2023-04-22 21:11:52.078 TaskSetManager: INFO: Starting task 5.0 in stage 14.0 (TID 81) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:52.079 TaskSetManager: INFO: Finished task 4.0 in stage 14.0 (TID 80) in 161 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:52.084 Executor: INFO: Running task 5.0 in stage 14.0 (TID 81) 2023-04-22 21:11:52.111 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:52.243 Executor: INFO: Finished task 5.0 in stage 14.0 (TID 81). 1197 bytes result sent to driver 2023-04-22 21:11:52.244 TaskSetManager: INFO: Starting task 6.0 in stage 14.0 (TID 82) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:52.244 TaskSetManager: INFO: Finished task 5.0 in stage 14.0 (TID 81) in 166 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:52.247 Executor: INFO: Running task 6.0 in stage 14.0 (TID 82) 2023-04-22 21:11:52.275 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:52.399 Executor: INFO: Finished task 6.0 in stage 14.0 (TID 82). 1197 bytes result sent to driver 2023-04-22 21:11:52.400 TaskSetManager: INFO: Starting task 7.0 in stage 14.0 (TID 83) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:52.402 TaskSetManager: INFO: Finished task 6.0 in stage 14.0 (TID 82) in 157 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:52.402 Executor: INFO: Running task 7.0 in stage 14.0 (TID 83) 2023-04-22 21:11:52.429 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:52.551 Executor: INFO: Finished task 7.0 in stage 14.0 (TID 83). 
1197 bytes result sent to driver 2023-04-22 21:11:52.552 TaskSetManager: INFO: Finished task 7.0 in stage 14.0 (TID 83) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:52.552 TaskSchedulerImpl: INFO: Removed TaskSet 14.0, whose tasks have all completed, from pool 2023-04-22 21:11:52.553 DAGScheduler: INFO: ShuffleMapStage 14 (treeAggregate at RowMatrix.scala:94) finished in 1.319 s 2023-04-22 21:11:52.553 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:52.553 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:52.553 DAGScheduler: INFO: waiting: Set(ResultStage 15) 2023-04-22 21:11:52.553 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:52.553 DAGScheduler: INFO: Submitting ResultStage 15 (MapPartitionsRDD[42] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:52.598 MemoryStore: INFO: Block broadcast_44 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:52.605 MemoryStore: INFO: Block broadcast_44_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:52.605 BlockManagerInfo: INFO: Added broadcast_44_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:52.606 SparkContext: INFO: Created broadcast 44 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:52.607 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 15 (MapPartitionsRDD[42] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:52.607 TaskSchedulerImpl: INFO: Adding task set 15.0 with 2 tasks resource profile 0 2023-04-22 21:11:52.608 TaskSetManager: INFO: Starting task 0.0 in stage 15.0 (TID 84) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:52.609 Executor: INFO: Running task 0.0 in stage 15.0 (TID 84) 2023-04-22 21:11:52.829 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:52.829 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:52.869 Executor: INFO: Finished task 0.0 in stage 15.0 (TID 84). 34689 bytes result sent to driver 2023-04-22 21:11:52.870 TaskSetManager: INFO: Starting task 1.0 in stage 15.0 (TID 85) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:52.870 TaskSetManager: INFO: Finished task 0.0 in stage 15.0 (TID 84) in 263 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:52.875 Executor: INFO: Running task 1.0 in stage 15.0 (TID 85) 2023-04-22 21:11:52.924 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:52.924 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:52.944 Executor: INFO: Finished task 1.0 in stage 15.0 (TID 85). 
34646 bytes result sent to driver 2023-04-22 21:11:52.945 TaskSetManager: INFO: Finished task 1.0 in stage 15.0 (TID 85) in 75 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:52.945 TaskSchedulerImpl: INFO: Removed TaskSet 15.0, whose tasks have all completed, from pool 2023-04-22 21:11:52.945 DAGScheduler: INFO: ResultStage 15 (treeAggregate at RowMatrix.scala:94) finished in 0.391 s 2023-04-22 21:11:52.945 DAGScheduler: INFO: Job 8 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:52.945 TaskSchedulerImpl: INFO: Killing all running tasks in stage 15: Stage finished 2023-04-22 21:11:52.946 DAGScheduler: INFO: Job 8 finished: treeAggregate at RowMatrix.scala:94, took 1.718756 s 2023-04-22 21:11:52.950 MemoryStore: INFO: Block broadcast_45 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:52.952 MemoryStore: INFO: Block broadcast_45_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:52.952 BlockManagerInfo: INFO: Added broadcast_45_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:52.953 SparkContext: INFO: Created broadcast 45 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:52.995 BlockManagerInfo: INFO: Removed broadcast_43_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.089 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:53.090 DAGScheduler: INFO: Registering RDD 44 (treeAggregate at RowMatrix.scala:94) as input to shuffle 7 2023-04-22 21:11:53.090 DAGScheduler: INFO: Got job 9 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:53.090 DAGScheduler: INFO: Final stage: ResultStage 17 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:53.090 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 16) 2023-04-22 21:11:53.090 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 16) 2023-04-22 21:11:53.102 DAGScheduler: INFO: Submitting ShuffleMapStage 16 (MapPartitionsRDD[44] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:53.149 BlockManagerInfo: INFO: Removed broadcast_32_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.155 MemoryStore: INFO: Block broadcast_46 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:53.165 MemoryStore: INFO: Block broadcast_46_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:53.165 BlockManagerInfo: INFO: Added broadcast_46_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.169 SparkContext: INFO: Created broadcast 46 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:53.170 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 16 (MapPartitionsRDD[44] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:53.170 TaskSchedulerImpl: INFO: Adding task set 16.0 with 8 tasks resource profile 0 2023-04-22 21:11:53.171 TaskSetManager: INFO: Starting task 0.0 in stage 16.0 (TID 86) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:53.171 Executor: INFO: Running task 0.0 in stage 16.0 (TID 86) 2023-04-22 21:11:53.204 
BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:53.246 BlockManagerInfo: INFO: Removed broadcast_29_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.326 BlockManagerInfo: INFO: Removed broadcast_41_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.356 Executor: INFO: Finished task 0.0 in stage 16.0 (TID 86). 1197 bytes result sent to driver 2023-04-22 21:11:53.356 TaskSetManager: INFO: Starting task 1.0 in stage 16.0 (TID 87) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:53.357 TaskSetManager: INFO: Finished task 0.0 in stage 16.0 (TID 86) in 186 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:53.357 Executor: INFO: Running task 1.0 in stage 16.0 (TID 87) 2023-04-22 21:11:53.391 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:53.468 BlockManagerInfo: INFO: Removed broadcast_31_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.547 Executor: INFO: Finished task 1.0 in stage 16.0 (TID 87). 1197 bytes result sent to driver 2023-04-22 21:11:53.547 TaskSetManager: INFO: Starting task 2.0 in stage 16.0 (TID 88) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:53.547 TaskSetManager: INFO: Finished task 1.0 in stage 16.0 (TID 87) in 191 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:53.548 Executor: INFO: Running task 2.0 in stage 16.0 (TID 88) 2023-04-22 21:11:53.589 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:53.603 BlockManagerInfo: INFO: Removed broadcast_30_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.658 BlockManagerInfo: INFO: Removed broadcast_33_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.736 BlockManagerInfo: INFO: Removed broadcast_35_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.747 Executor: INFO: Finished task 2.0 in stage 16.0 (TID 88). 1197 bytes result sent to driver 2023-04-22 21:11:53.748 TaskSetManager: INFO: Starting task 3.0 in stage 16.0 (TID 89) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:53.748 TaskSetManager: INFO: Finished task 2.0 in stage 16.0 (TID 88) in 201 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:53.750 Executor: INFO: Running task 3.0 in stage 16.0 (TID 89) 2023-04-22 21:11:53.777 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:53.804 BlockManagerInfo: INFO: Removed broadcast_28_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.878 BlockManagerInfo: INFO: Removed broadcast_36_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.939 Executor: INFO: Finished task 3.0 in stage 16.0 (TID 89). 
1197 bytes result sent to driver 2023-04-22 21:11:53.940 TaskSetManager: INFO: Starting task 4.0 in stage 16.0 (TID 90) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:53.940 TaskSetManager: INFO: Finished task 3.0 in stage 16.0 (TID 89) in 193 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:53.941 Executor: INFO: Running task 4.0 in stage 16.0 (TID 90) 2023-04-22 21:11:53.955 BlockManagerInfo: INFO: Removed broadcast_39_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:53.976 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:54.031 BlockManagerInfo: INFO: Removed broadcast_40_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.105 BlockManagerInfo: INFO: Removed broadcast_37_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.142 Executor: INFO: Finished task 4.0 in stage 16.0 (TID 90). 1197 bytes result sent to driver 2023-04-22 21:11:54.142 TaskSetManager: INFO: Starting task 5.0 in stage 16.0 (TID 91) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.143 TaskSetManager: INFO: Finished task 4.0 in stage 16.0 (TID 90) in 203 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:54.144 Executor: INFO: Running task 5.0 in stage 16.0 (TID 91) 2023-04-22 21:11:54.171 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:54.215 BlockManagerInfo: INFO: Removed broadcast_34_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.266 BlockManagerInfo: INFO: Removed broadcast_38_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.309 Executor: INFO: Finished task 5.0 in stage 16.0 (TID 91). 1197 bytes result sent to driver 2023-04-22 21:11:54.310 TaskSetManager: INFO: Starting task 6.0 in stage 16.0 (TID 92) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.310 TaskSetManager: INFO: Finished task 5.0 in stage 16.0 (TID 91) in 168 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:54.310 Executor: INFO: Running task 6.0 in stage 16.0 (TID 92) 2023-04-22 21:11:54.339 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:54.476 Executor: INFO: Finished task 6.0 in stage 16.0 (TID 92). 1197 bytes result sent to driver 2023-04-22 21:11:54.476 TaskSetManager: INFO: Starting task 7.0 in stage 16.0 (TID 93) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.477 TaskSetManager: INFO: Finished task 6.0 in stage 16.0 (TID 92) in 167 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:54.477 Executor: INFO: Running task 7.0 in stage 16.0 (TID 93) 2023-04-22 21:11:54.503 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:54.627 Executor: INFO: Finished task 7.0 in stage 16.0 (TID 93). 
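
The "Removed broadcast_NN_piece0" records interleaved with stage 16 come from Spark's ContextCleaner: once the driver no longer references the task binaries and broadcast vectors of earlier iterations, their blocks are asynchronously unpersisted from the block manager, which is why the reported free memory hovers around 25.2-25.3 GiB instead of shrinking by the megabyte or two of broadcasts each job creates. The snippet below is a generic illustration of the same cleanup done explicitly; it is not something this pipeline calls.

```python
from pyspark.sql import SparkSession

# Generic illustration (not Hail code): what ContextCleaner does automatically
# can also be done by hand once a broadcast is no longer needed.
spark = SparkSession.builder.master("local[2]").getOrCreate()
sc = spark.sparkContext

bc = sc.broadcast(list(range(1_000_000)))
# ... run jobs that read bc.value ...
bc.unpersist()  # drop executor-side copies ("Removed broadcast_*" in the log)
bc.destroy()    # additionally release the driver-side copy
```
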
1197 bytes result sent to driver 2023-04-22 21:11:54.628 TaskSetManager: INFO: Finished task 7.0 in stage 16.0 (TID 93) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:54.628 TaskSchedulerImpl: INFO: Removed TaskSet 16.0, whose tasks have all completed, from pool 2023-04-22 21:11:54.629 DAGScheduler: INFO: ShuffleMapStage 16 (treeAggregate at RowMatrix.scala:94) finished in 1.525 s 2023-04-22 21:11:54.629 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:54.629 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:54.629 DAGScheduler: INFO: waiting: Set(ResultStage 17) 2023-04-22 21:11:54.629 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:54.629 DAGScheduler: INFO: Submitting ResultStage 17 (MapPartitionsRDD[46] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:54.671 MemoryStore: INFO: Block broadcast_47 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:54.677 MemoryStore: INFO: Block broadcast_47_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:54.677 BlockManagerInfo: INFO: Added broadcast_47_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.678 SparkContext: INFO: Created broadcast 47 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:54.678 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 17 (MapPartitionsRDD[46] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:54.678 TaskSchedulerImpl: INFO: Adding task set 17.0 with 2 tasks resource profile 0 2023-04-22 21:11:54.679 TaskSetManager: INFO: Starting task 0.0 in stage 17.0 (TID 94) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.680 Executor: INFO: Running task 0.0 in stage 17.0 (TID 94) 2023-04-22 21:11:54.711 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:54.713 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 3 ms 2023-04-22 21:11:54.721 Executor: INFO: Finished task 0.0 in stage 17.0 (TID 94). 34646 bytes result sent to driver 2023-04-22 21:11:54.723 TaskSetManager: INFO: Starting task 1.0 in stage 17.0 (TID 95) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.729 TaskSetManager: INFO: Finished task 0.0 in stage 17.0 (TID 94) in 50 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:54.730 Executor: INFO: Running task 1.0 in stage 17.0 (TID 95) 2023-04-22 21:11:54.759 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:54.759 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:54.764 Executor: INFO: Finished task 1.0 in stage 17.0 (TID 95). 
34646 bytes result sent to driver 2023-04-22 21:11:54.766 TaskSetManager: INFO: Finished task 1.0 in stage 17.0 (TID 95) in 44 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:54.766 TaskSchedulerImpl: INFO: Removed TaskSet 17.0, whose tasks have all completed, from pool 2023-04-22 21:11:54.767 DAGScheduler: INFO: ResultStage 17 (treeAggregate at RowMatrix.scala:94) finished in 0.137 s 2023-04-22 21:11:54.767 DAGScheduler: INFO: Job 9 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:54.767 TaskSchedulerImpl: INFO: Killing all running tasks in stage 17: Stage finished 2023-04-22 21:11:54.768 DAGScheduler: INFO: Job 9 finished: treeAggregate at RowMatrix.scala:94, took 1.678758 s 2023-04-22 21:11:54.771 MemoryStore: INFO: Block broadcast_48 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:54.773 MemoryStore: INFO: Block broadcast_48_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:54.773 BlockManagerInfo: INFO: Added broadcast_48_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.775 SparkContext: INFO: Created broadcast 48 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:54.845 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:54.854 DAGScheduler: INFO: Registering RDD 48 (treeAggregate at RowMatrix.scala:94) as input to shuffle 8 2023-04-22 21:11:54.855 DAGScheduler: INFO: Got job 10 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:54.855 DAGScheduler: INFO: Final stage: ResultStage 19 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:54.855 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 18) 2023-04-22 21:11:54.855 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 18) 2023-04-22 21:11:54.858 DAGScheduler: INFO: Submitting ShuffleMapStage 18 (MapPartitionsRDD[48] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:54.917 MemoryStore: INFO: Block broadcast_49 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:54.923 MemoryStore: INFO: Block broadcast_49_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:54.923 BlockManagerInfo: INFO: Added broadcast_49_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:54.924 SparkContext: INFO: Created broadcast 49 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:54.925 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 18 (MapPartitionsRDD[48] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:54.925 TaskSchedulerImpl: INFO: Adding task set 18.0 with 8 tasks resource profile 0 2023-04-22 21:11:54.926 TaskSetManager: INFO: Starting task 0.0 in stage 18.0 (TID 96) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:54.926 Executor: INFO: Running task 0.0 in stage 18.0 (TID 96) 2023-04-22 21:11:54.952 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:55.201 Executor: INFO: Finished task 0.0 in stage 18.0 (TID 96). 
1197 bytes result sent to driver 2023-04-22 21:11:55.202 TaskSetManager: INFO: Starting task 1.0 in stage 18.0 (TID 97) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:55.202 TaskSetManager: INFO: Finished task 0.0 in stage 18.0 (TID 96) in 276 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:55.202 Executor: INFO: Running task 1.0 in stage 18.0 (TID 97) 2023-04-22 21:11:55.243 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:55.369 Executor: INFO: Finished task 1.0 in stage 18.0 (TID 97). 1197 bytes result sent to driver 2023-04-22 21:11:55.369 TaskSetManager: INFO: Starting task 2.0 in stage 18.0 (TID 98) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:55.372 TaskSetManager: INFO: Finished task 1.0 in stage 18.0 (TID 97) in 171 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:55.374 Executor: INFO: Running task 2.0 in stage 18.0 (TID 98) 2023-04-22 21:11:55.401 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:55.529 Executor: INFO: Finished task 2.0 in stage 18.0 (TID 98). 1197 bytes result sent to driver 2023-04-22 21:11:55.529 TaskSetManager: INFO: Starting task 3.0 in stage 18.0 (TID 99) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:55.530 TaskSetManager: INFO: Finished task 2.0 in stage 18.0 (TID 98) in 161 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:55.531 Executor: INFO: Running task 3.0 in stage 18.0 (TID 99) 2023-04-22 21:11:55.557 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:55.685 Executor: INFO: Finished task 3.0 in stage 18.0 (TID 99). 1197 bytes result sent to driver 2023-04-22 21:11:55.686 TaskSetManager: INFO: Starting task 4.0 in stage 18.0 (TID 100) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:55.686 TaskSetManager: INFO: Finished task 3.0 in stage 18.0 (TID 99) in 157 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:55.687 Executor: INFO: Running task 4.0 in stage 18.0 (TID 100) 2023-04-22 21:11:55.714 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:55.852 Executor: INFO: Finished task 4.0 in stage 18.0 (TID 100). 1197 bytes result sent to driver 2023-04-22 21:11:55.852 TaskSetManager: INFO: Starting task 5.0 in stage 18.0 (TID 101) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:55.852 TaskSetManager: INFO: Finished task 4.0 in stage 18.0 (TID 100) in 166 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:55.855 Executor: INFO: Running task 5.0 in stage 18.0 (TID 101) 2023-04-22 21:11:55.881 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:56.006 Executor: INFO: Finished task 5.0 in stage 18.0 (TID 101). 
1197 bytes result sent to driver 2023-04-22 21:11:56.006 TaskSetManager: INFO: Starting task 6.0 in stage 18.0 (TID 102) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.007 TaskSetManager: INFO: Finished task 5.0 in stage 18.0 (TID 101) in 154 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:56.007 Executor: INFO: Running task 6.0 in stage 18.0 (TID 102) 2023-04-22 21:11:56.033 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:56.168 Executor: INFO: Finished task 6.0 in stage 18.0 (TID 102). 1197 bytes result sent to driver 2023-04-22 21:11:56.168 TaskSetManager: INFO: Starting task 7.0 in stage 18.0 (TID 103) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.168 TaskSetManager: INFO: Finished task 6.0 in stage 18.0 (TID 102) in 162 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:56.170 Executor: INFO: Running task 7.0 in stage 18.0 (TID 103) 2023-04-22 21:11:56.196 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:56.325 Executor: INFO: Finished task 7.0 in stage 18.0 (TID 103). 1197 bytes result sent to driver 2023-04-22 21:11:56.326 TaskSetManager: INFO: Finished task 7.0 in stage 18.0 (TID 103) in 158 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:56.326 TaskSchedulerImpl: INFO: Removed TaskSet 18.0, whose tasks have all completed, from pool 2023-04-22 21:11:56.326 DAGScheduler: INFO: ShuffleMapStage 18 (treeAggregate at RowMatrix.scala:94) finished in 1.467 s 2023-04-22 21:11:56.326 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:56.326 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:56.326 DAGScheduler: INFO: waiting: Set(ResultStage 19) 2023-04-22 21:11:56.326 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:56.327 DAGScheduler: INFO: Submitting ResultStage 19 (MapPartitionsRDD[50] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:56.369 MemoryStore: INFO: Block broadcast_50 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:56.378 MemoryStore: INFO: Block broadcast_50_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:56.379 BlockManagerInfo: INFO: Added broadcast_50_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:56.380 SparkContext: INFO: Created broadcast 50 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:56.388 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 19 (MapPartitionsRDD[50] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:56.388 TaskSchedulerImpl: INFO: Adding task set 19.0 with 2 tasks resource profile 0 2023-04-22 21:11:56.389 TaskSetManager: INFO: Starting task 0.0 in stage 19.0 (TID 104) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.389 Executor: INFO: Running task 0.0 in stage 19.0 (TID 104) 2023-04-22 21:11:56.416 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:56.416 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:11:56.421 Executor: INFO: Finished task 0.0 in stage 19.0 (TID 104). 34646 bytes result sent to driver 2023-04-22 21:11:56.431 TaskSetManager: INFO: Starting task 1.0 in stage 19.0 (TID 105) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.431 TaskSetManager: INFO: Finished task 0.0 in stage 19.0 (TID 104) in 42 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:56.432 Executor: INFO: Running task 1.0 in stage 19.0 (TID 105) 2023-04-22 21:11:56.494 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:56.494 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:56.499 Executor: INFO: Finished task 1.0 in stage 19.0 (TID 105). 34646 bytes result sent to driver 2023-04-22 21:11:56.507 TaskSetManager: INFO: Finished task 1.0 in stage 19.0 (TID 105) in 76 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:56.507 TaskSchedulerImpl: INFO: Removed TaskSet 19.0, whose tasks have all completed, from pool 2023-04-22 21:11:56.507 DAGScheduler: INFO: ResultStage 19 (treeAggregate at RowMatrix.scala:94) finished in 0.180 s 2023-04-22 21:11:56.507 DAGScheduler: INFO: Job 10 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:56.507 TaskSchedulerImpl: INFO: Killing all running tasks in stage 19: Stage finished 2023-04-22 21:11:56.508 DAGScheduler: INFO: Job 10 finished: treeAggregate at RowMatrix.scala:94, took 1.663089 s 2023-04-22 21:11:56.510 MemoryStore: INFO: Block broadcast_51 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:56.512 MemoryStore: INFO: Block broadcast_51_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:56.513 BlockManagerInfo: INFO: Added broadcast_51_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:56.513 SparkContext: INFO: Created broadcast 51 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:56.587 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:56.588 DAGScheduler: INFO: Registering RDD 52 (treeAggregate at RowMatrix.scala:94) as input to shuffle 9 2023-04-22 21:11:56.588 DAGScheduler: INFO: Got job 11 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:56.588 DAGScheduler: INFO: Final stage: ResultStage 21 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:56.588 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 20) 2023-04-22 21:11:56.588 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 20) 2023-04-22 21:11:56.592 DAGScheduler: INFO: Submitting ShuffleMapStage 20 (MapPartitionsRDD[52] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:56.630 MemoryStore: INFO: Block broadcast_52 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:56.636 MemoryStore: INFO: Block broadcast_52_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:56.636 BlockManagerInfo: INFO: Added broadcast_52_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:56.640 SparkContext: INFO: Created broadcast 52 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:11:56.641 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 20 (MapPartitionsRDD[52] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:56.641 TaskSchedulerImpl: INFO: Adding task set 20.0 with 8 tasks resource profile 0 2023-04-22 21:11:56.642 TaskSetManager: INFO: Starting task 0.0 in stage 20.0 (TID 106) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.642 Executor: INFO: Running task 0.0 in stage 20.0 (TID 106) 2023-04-22 21:11:56.669 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:56.794 Executor: INFO: Finished task 0.0 in stage 20.0 (TID 106). 1197 bytes result sent to driver 2023-04-22 21:11:56.794 TaskSetManager: INFO: Starting task 1.0 in stage 20.0 (TID 107) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.795 TaskSetManager: INFO: Finished task 0.0 in stage 20.0 (TID 106) in 153 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:56.799 Executor: INFO: Running task 1.0 in stage 20.0 (TID 107) 2023-04-22 21:11:56.827 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:56.953 Executor: INFO: Finished task 1.0 in stage 20.0 (TID 107). 1197 bytes result sent to driver 2023-04-22 21:11:56.953 TaskSetManager: INFO: Starting task 2.0 in stage 20.0 (TID 108) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:56.958 TaskSetManager: INFO: Finished task 1.0 in stage 20.0 (TID 107) in 164 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:56.959 Executor: INFO: Running task 2.0 in stage 20.0 (TID 108) 2023-04-22 21:11:56.985 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:57.114 Executor: INFO: Finished task 2.0 in stage 20.0 (TID 108). 1197 bytes result sent to driver 2023-04-22 21:11:57.115 TaskSetManager: INFO: Starting task 3.0 in stage 20.0 (TID 109) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.116 TaskSetManager: INFO: Finished task 2.0 in stage 20.0 (TID 108) in 163 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:57.116 Executor: INFO: Running task 3.0 in stage 20.0 (TID 109) 2023-04-22 21:11:57.142 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:57.275 Executor: INFO: Finished task 3.0 in stage 20.0 (TID 109). 1197 bytes result sent to driver 2023-04-22 21:11:57.275 TaskSetManager: INFO: Starting task 4.0 in stage 20.0 (TID 110) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.276 TaskSetManager: INFO: Finished task 3.0 in stage 20.0 (TID 109) in 161 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:57.280 Executor: INFO: Running task 4.0 in stage 20.0 (TID 110) 2023-04-22 21:11:57.305 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:57.436 Executor: INFO: Finished task 4.0 in stage 20.0 (TID 110). 
1197 bytes result sent to driver 2023-04-22 21:11:57.437 TaskSetManager: INFO: Starting task 5.0 in stage 20.0 (TID 111) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.438 TaskSetManager: INFO: Finished task 4.0 in stage 20.0 (TID 110) in 162 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:57.439 Executor: INFO: Running task 5.0 in stage 20.0 (TID 111) 2023-04-22 21:11:57.465 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:57.590 Executor: INFO: Finished task 5.0 in stage 20.0 (TID 111). 1197 bytes result sent to driver 2023-04-22 21:11:57.590 TaskSetManager: INFO: Starting task 6.0 in stage 20.0 (TID 112) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.590 TaskSetManager: INFO: Finished task 5.0 in stage 20.0 (TID 111) in 153 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:57.591 Executor: INFO: Running task 6.0 in stage 20.0 (TID 112) 2023-04-22 21:11:57.617 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:57.745 Executor: INFO: Finished task 6.0 in stage 20.0 (TID 112). 1197 bytes result sent to driver 2023-04-22 21:11:57.748 TaskSetManager: INFO: Starting task 7.0 in stage 20.0 (TID 113) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.750 TaskSetManager: INFO: Finished task 6.0 in stage 20.0 (TID 112) in 160 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:57.751 Executor: INFO: Running task 7.0 in stage 20.0 (TID 113) 2023-04-22 21:11:57.782 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:57.906 Executor: INFO: Finished task 7.0 in stage 20.0 (TID 113). 
1197 bytes result sent to driver 2023-04-22 21:11:57.906 TaskSetManager: INFO: Finished task 7.0 in stage 20.0 (TID 113) in 159 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:57.906 TaskSchedulerImpl: INFO: Removed TaskSet 20.0, whose tasks have all completed, from pool 2023-04-22 21:11:57.907 DAGScheduler: INFO: ShuffleMapStage 20 (treeAggregate at RowMatrix.scala:94) finished in 1.314 s 2023-04-22 21:11:57.907 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:57.907 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:57.907 DAGScheduler: INFO: waiting: Set(ResultStage 21) 2023-04-22 21:11:57.907 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:57.907 DAGScheduler: INFO: Submitting ResultStage 21 (MapPartitionsRDD[54] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:57.969 MemoryStore: INFO: Block broadcast_53 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:57.975 MemoryStore: INFO: Block broadcast_53_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:57.977 BlockManagerInfo: INFO: Added broadcast_53_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:57.977 SparkContext: INFO: Created broadcast 53 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:57.978 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 21 (MapPartitionsRDD[54] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:57.978 TaskSchedulerImpl: INFO: Adding task set 21.0 with 2 tasks resource profile 0 2023-04-22 21:11:57.980 TaskSetManager: INFO: Starting task 0.0 in stage 21.0 (TID 114) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:57.981 Executor: INFO: Running task 0.0 in stage 21.0 (TID 114) 2023-04-22 21:11:58.028 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:58.028 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 21 ms 2023-04-22 21:11:58.033 Executor: INFO: Finished task 0.0 in stage 21.0 (TID 114). 34646 bytes result sent to driver 2023-04-22 21:11:58.034 TaskSetManager: INFO: Starting task 1.0 in stage 21.0 (TID 115) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:58.035 TaskSetManager: INFO: Finished task 0.0 in stage 21.0 (TID 114) in 55 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:58.035 Executor: INFO: Running task 1.0 in stage 21.0 (TID 115) 2023-04-22 21:11:58.086 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:58.086 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:11:58.090 Executor: INFO: Finished task 1.0 in stage 21.0 (TID 115). 
34646 bytes result sent to driver 2023-04-22 21:11:58.091 TaskSetManager: INFO: Finished task 1.0 in stage 21.0 (TID 115) in 57 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:58.091 TaskSchedulerImpl: INFO: Removed TaskSet 21.0, whose tasks have all completed, from pool 2023-04-22 21:11:58.092 DAGScheduler: INFO: ResultStage 21 (treeAggregate at RowMatrix.scala:94) finished in 0.184 s 2023-04-22 21:11:58.092 DAGScheduler: INFO: Job 11 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:58.092 TaskSchedulerImpl: INFO: Killing all running tasks in stage 21: Stage finished 2023-04-22 21:11:58.092 DAGScheduler: INFO: Job 11 finished: treeAggregate at RowMatrix.scala:94, took 1.505477 s 2023-04-22 21:11:58.095 MemoryStore: INFO: Block broadcast_54 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:58.097 MemoryStore: INFO: Block broadcast_54_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:58.097 BlockManagerInfo: INFO: Added broadcast_54_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:58.100 SparkContext: INFO: Created broadcast 54 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:58.225 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:58.226 DAGScheduler: INFO: Registering RDD 56 (treeAggregate at RowMatrix.scala:94) as input to shuffle 10 2023-04-22 21:11:58.226 DAGScheduler: INFO: Got job 12 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:58.226 DAGScheduler: INFO: Final stage: ResultStage 23 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:58.226 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 22) 2023-04-22 21:11:58.226 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 22) 2023-04-22 21:11:58.229 DAGScheduler: INFO: Submitting ShuffleMapStage 22 (MapPartitionsRDD[56] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:58.280 MemoryStore: INFO: Block broadcast_55 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:11:58.287 MemoryStore: INFO: Block broadcast_55_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:11:58.299 BlockManagerInfo: INFO: Added broadcast_55_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:11:58.301 SparkContext: INFO: Created broadcast 55 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:58.301 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 22 (MapPartitionsRDD[56] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:11:58.301 TaskSchedulerImpl: INFO: Adding task set 22.0 with 8 tasks resource profile 0 2023-04-22 21:11:58.303 TaskSetManager: INFO: Starting task 0.0 in stage 22.0 (TID 116) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:11:58.304 Executor: INFO: Running task 0.0 in stage 22.0 (TID 116) 2023-04-22 21:11:58.333 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:11:58.490 Executor: INFO: Finished task 0.0 in stage 22.0 (TID 116). 
1197 bytes result sent to driver 2023-04-22 21:11:58.491 TaskSetManager: INFO: Starting task 1.0 in stage 22.0 (TID 117) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:58.492 TaskSetManager: INFO: Finished task 0.0 in stage 22.0 (TID 116) in 189 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:11:58.493 Executor: INFO: Running task 1.0 in stage 22.0 (TID 117) 2023-04-22 21:11:58.522 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:11:58.763 Executor: INFO: Finished task 1.0 in stage 22.0 (TID 117). 1197 bytes result sent to driver 2023-04-22 21:11:58.765 TaskSetManager: INFO: Starting task 2.0 in stage 22.0 (TID 118) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:58.765 TaskSetManager: INFO: Finished task 1.0 in stage 22.0 (TID 117) in 274 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:11:58.768 Executor: INFO: Running task 2.0 in stage 22.0 (TID 118) 2023-04-22 21:11:58.795 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:11:58.938 Executor: INFO: Finished task 2.0 in stage 22.0 (TID 118). 1197 bytes result sent to driver 2023-04-22 21:11:58.939 TaskSetManager: INFO: Starting task 3.0 in stage 22.0 (TID 119) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:58.940 TaskSetManager: INFO: Finished task 2.0 in stage 22.0 (TID 118) in 175 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:11:58.942 Executor: INFO: Running task 3.0 in stage 22.0 (TID 119) 2023-04-22 21:11:58.968 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:11:59.100 Executor: INFO: Finished task 3.0 in stage 22.0 (TID 119). 1197 bytes result sent to driver 2023-04-22 21:11:59.100 TaskSetManager: INFO: Starting task 4.0 in stage 22.0 (TID 120) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.101 TaskSetManager: INFO: Finished task 3.0 in stage 22.0 (TID 119) in 161 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:11:59.102 Executor: INFO: Running task 4.0 in stage 22.0 (TID 120) 2023-04-22 21:11:59.128 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:11:59.257 Executor: INFO: Finished task 4.0 in stage 22.0 (TID 120). 1197 bytes result sent to driver 2023-04-22 21:11:59.260 TaskSetManager: INFO: Starting task 5.0 in stage 22.0 (TID 121) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.262 TaskSetManager: INFO: Finished task 4.0 in stage 22.0 (TID 120) in 162 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:11:59.264 Executor: INFO: Running task 5.0 in stage 22.0 (TID 121) 2023-04-22 21:11:59.290 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:11:59.415 Executor: INFO: Finished task 5.0 in stage 22.0 (TID 121). 
1197 bytes result sent to driver 2023-04-22 21:11:59.417 TaskSetManager: INFO: Starting task 6.0 in stage 22.0 (TID 122) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.419 TaskSetManager: INFO: Finished task 5.0 in stage 22.0 (TID 121) in 159 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:11:59.420 Executor: INFO: Running task 6.0 in stage 22.0 (TID 122) 2023-04-22 21:11:59.448 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:11:59.576 Executor: INFO: Finished task 6.0 in stage 22.0 (TID 122). 1197 bytes result sent to driver 2023-04-22 21:11:59.577 TaskSetManager: INFO: Starting task 7.0 in stage 22.0 (TID 123) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.577 TaskSetManager: INFO: Finished task 6.0 in stage 22.0 (TID 122) in 161 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:11:59.578 Executor: INFO: Running task 7.0 in stage 22.0 (TID 123) 2023-04-22 21:11:59.605 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:11:59.729 Executor: INFO: Finished task 7.0 in stage 22.0 (TID 123). 1197 bytes result sent to driver 2023-04-22 21:11:59.730 TaskSetManager: INFO: Finished task 7.0 in stage 22.0 (TID 123) in 154 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:11:59.730 TaskSchedulerImpl: INFO: Removed TaskSet 22.0, whose tasks have all completed, from pool 2023-04-22 21:11:59.730 DAGScheduler: INFO: ShuffleMapStage 22 (treeAggregate at RowMatrix.scala:94) finished in 1.500 s 2023-04-22 21:11:59.730 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:11:59.730 DAGScheduler: INFO: running: Set() 2023-04-22 21:11:59.730 DAGScheduler: INFO: waiting: Set(ResultStage 23) 2023-04-22 21:11:59.730 DAGScheduler: INFO: failed: Set() 2023-04-22 21:11:59.731 DAGScheduler: INFO: Submitting ResultStage 23 (MapPartitionsRDD[58] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:11:59.774 MemoryStore: INFO: Block broadcast_56 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:11:59.780 MemoryStore: INFO: Block broadcast_56_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:11:59.780 BlockManagerInfo: INFO: Added broadcast_56_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:11:59.782 SparkContext: INFO: Created broadcast 56 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:11:59.782 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 23 (MapPartitionsRDD[58] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:11:59.782 TaskSchedulerImpl: INFO: Adding task set 23.0 with 2 tasks resource profile 0 2023-04-22 21:11:59.783 TaskSetManager: INFO: Starting task 0.0 in stage 23.0 (TID 124) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.783 Executor: INFO: Running task 0.0 in stage 23.0 (TID 124) 2023-04-22 21:11:59.810 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:59.810 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:11:59.815 Executor: INFO: Finished task 0.0 in stage 23.0 (TID 124). 34646 bytes result sent to driver 2023-04-22 21:11:59.816 TaskSetManager: INFO: Starting task 1.0 in stage 23.0 (TID 125) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:11:59.817 TaskSetManager: INFO: Finished task 0.0 in stage 23.0 (TID 124) in 34 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:11:59.822 Executor: INFO: Running task 1.0 in stage 23.0 (TID 125) 2023-04-22 21:11:59.849 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:11:59.849 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 1 ms 2023-04-22 21:11:59.854 Executor: INFO: Finished task 1.0 in stage 23.0 (TID 125). 34646 bytes result sent to driver 2023-04-22 21:11:59.855 TaskSetManager: INFO: Finished task 1.0 in stage 23.0 (TID 125) in 39 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:11:59.855 TaskSchedulerImpl: INFO: Removed TaskSet 23.0, whose tasks have all completed, from pool 2023-04-22 21:11:59.858 DAGScheduler: INFO: ResultStage 23 (treeAggregate at RowMatrix.scala:94) finished in 0.127 s 2023-04-22 21:11:59.858 DAGScheduler: INFO: Job 12 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:11:59.858 TaskSchedulerImpl: INFO: Killing all running tasks in stage 23: Stage finished 2023-04-22 21:11:59.858 DAGScheduler: INFO: Job 12 finished: treeAggregate at RowMatrix.scala:94, took 1.633307 s 2023-04-22 21:11:59.861 MemoryStore: INFO: Block broadcast_57 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:59.863 MemoryStore: INFO: Block broadcast_57_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:11:59.863 BlockManagerInfo: INFO: Added broadcast_57_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:11:59.865 SparkContext: INFO: Created broadcast 57 from broadcast at RowMatrix.scala:93 2023-04-22 21:11:59.987 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:11:59.988 DAGScheduler: INFO: Registering RDD 60 (treeAggregate at RowMatrix.scala:94) as input to shuffle 11 2023-04-22 21:11:59.988 DAGScheduler: INFO: Got job 13 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:11:59.988 DAGScheduler: INFO: Final stage: ResultStage 25 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:11:59.988 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 24) 2023-04-22 21:11:59.988 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 24) 2023-04-22 21:11:59.990 DAGScheduler: INFO: Submitting ShuffleMapStage 24 (MapPartitionsRDD[60] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:00.053 MemoryStore: INFO: Block broadcast_58 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:00.073 MemoryStore: INFO: Block broadcast_58_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:00.073 BlockManagerInfo: INFO: Added broadcast_58_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:00.073 SparkContext: INFO: Created broadcast 58 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:12:00.074 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 24 (MapPartitionsRDD[60] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:00.074 TaskSchedulerImpl: INFO: Adding task set 24.0 with 8 tasks resource profile 0 2023-04-22 21:12:00.075 TaskSetManager: INFO: Starting task 0.0 in stage 24.0 (TID 126) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.076 Executor: INFO: Running task 0.0 in stage 24.0 (TID 126) 2023-04-22 21:12:00.120 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:00.337 Executor: INFO: Finished task 0.0 in stage 24.0 (TID 126). 1197 bytes result sent to driver 2023-04-22 21:12:00.338 TaskSetManager: INFO: Starting task 1.0 in stage 24.0 (TID 127) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.339 TaskSetManager: INFO: Finished task 0.0 in stage 24.0 (TID 126) in 264 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:00.341 Executor: INFO: Running task 1.0 in stage 24.0 (TID 127) 2023-04-22 21:12:00.370 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:00.498 Executor: INFO: Finished task 1.0 in stage 24.0 (TID 127). 1197 bytes result sent to driver 2023-04-22 21:12:00.500 TaskSetManager: INFO: Starting task 2.0 in stage 24.0 (TID 128) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.501 TaskSetManager: INFO: Finished task 1.0 in stage 24.0 (TID 127) in 163 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:00.503 Executor: INFO: Running task 2.0 in stage 24.0 (TID 128) 2023-04-22 21:12:00.535 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:00.662 Executor: INFO: Finished task 2.0 in stage 24.0 (TID 128). 1197 bytes result sent to driver 2023-04-22 21:12:00.663 TaskSetManager: INFO: Starting task 3.0 in stage 24.0 (TID 129) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.663 TaskSetManager: INFO: Finished task 2.0 in stage 24.0 (TID 128) in 163 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:00.664 Executor: INFO: Running task 3.0 in stage 24.0 (TID 129) 2023-04-22 21:12:00.690 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:00.819 Executor: INFO: Finished task 3.0 in stage 24.0 (TID 129). 1197 bytes result sent to driver 2023-04-22 21:12:00.823 TaskSetManager: INFO: Starting task 4.0 in stage 24.0 (TID 130) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.827 TaskSetManager: INFO: Finished task 3.0 in stage 24.0 (TID 129) in 164 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:00.828 Executor: INFO: Running task 4.0 in stage 24.0 (TID 130) 2023-04-22 21:12:00.854 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:00.979 Executor: INFO: Finished task 4.0 in stage 24.0 (TID 130). 
1197 bytes result sent to driver 2023-04-22 21:12:00.979 TaskSetManager: INFO: Starting task 5.0 in stage 24.0 (TID 131) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:00.979 TaskSetManager: INFO: Finished task 4.0 in stage 24.0 (TID 130) in 156 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:00.980 Executor: INFO: Running task 5.0 in stage 24.0 (TID 131) 2023-04-22 21:12:01.007 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:01.132 Executor: INFO: Finished task 5.0 in stage 24.0 (TID 131). 1197 bytes result sent to driver 2023-04-22 21:12:01.132 TaskSetManager: INFO: Starting task 6.0 in stage 24.0 (TID 132) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.133 TaskSetManager: INFO: Finished task 5.0 in stage 24.0 (TID 131) in 154 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:01.134 Executor: INFO: Running task 6.0 in stage 24.0 (TID 132) 2023-04-22 21:12:01.160 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:01.284 Executor: INFO: Finished task 6.0 in stage 24.0 (TID 132). 1197 bytes result sent to driver 2023-04-22 21:12:01.284 TaskSetManager: INFO: Starting task 7.0 in stage 24.0 (TID 133) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.285 TaskSetManager: INFO: Finished task 6.0 in stage 24.0 (TID 132) in 153 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:01.286 Executor: INFO: Running task 7.0 in stage 24.0 (TID 133) 2023-04-22 21:12:01.314 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:01.436 Executor: INFO: Finished task 7.0 in stage 24.0 (TID 133). 
1197 bytes result sent to driver 2023-04-22 21:12:01.437 TaskSetManager: INFO: Finished task 7.0 in stage 24.0 (TID 133) in 153 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:01.437 TaskSchedulerImpl: INFO: Removed TaskSet 24.0, whose tasks have all completed, from pool 2023-04-22 21:12:01.438 DAGScheduler: INFO: ShuffleMapStage 24 (treeAggregate at RowMatrix.scala:94) finished in 1.448 s 2023-04-22 21:12:01.438 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:01.438 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:01.438 DAGScheduler: INFO: waiting: Set(ResultStage 25) 2023-04-22 21:12:01.438 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:01.438 DAGScheduler: INFO: Submitting ResultStage 25 (MapPartitionsRDD[62] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:01.482 MemoryStore: INFO: Block broadcast_59 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:01.488 MemoryStore: INFO: Block broadcast_59_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:01.489 BlockManagerInfo: INFO: Added broadcast_59_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:01.490 SparkContext: INFO: Created broadcast 59 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:01.490 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 25 (MapPartitionsRDD[62] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:01.490 TaskSchedulerImpl: INFO: Adding task set 25.0 with 2 tasks resource profile 0 2023-04-22 21:12:01.491 TaskSetManager: INFO: Starting task 0.0 in stage 25.0 (TID 134) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.491 Executor: INFO: Running task 0.0 in stage 25.0 (TID 134) 2023-04-22 21:12:01.517 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:01.517 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:01.521 Executor: INFO: Finished task 0.0 in stage 25.0 (TID 134). 34646 bytes result sent to driver 2023-04-22 21:12:01.525 TaskSetManager: INFO: Starting task 1.0 in stage 25.0 (TID 135) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.525 TaskSetManager: INFO: Finished task 0.0 in stage 25.0 (TID 134) in 35 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:01.530 Executor: INFO: Running task 1.0 in stage 25.0 (TID 135) 2023-04-22 21:12:01.556 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:01.557 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:01.561 Executor: INFO: Finished task 1.0 in stage 25.0 (TID 135). 
34646 bytes result sent to driver 2023-04-22 21:12:01.564 TaskSetManager: INFO: Finished task 1.0 in stage 25.0 (TID 135) in 39 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:01.564 TaskSchedulerImpl: INFO: Removed TaskSet 25.0, whose tasks have all completed, from pool 2023-04-22 21:12:01.564 DAGScheduler: INFO: ResultStage 25 (treeAggregate at RowMatrix.scala:94) finished in 0.126 s 2023-04-22 21:12:01.565 DAGScheduler: INFO: Job 13 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:01.565 TaskSchedulerImpl: INFO: Killing all running tasks in stage 25: Stage finished 2023-04-22 21:12:01.565 DAGScheduler: INFO: Job 13 finished: treeAggregate at RowMatrix.scala:94, took 1.578282 s 2023-04-22 21:12:01.568 MemoryStore: INFO: Block broadcast_60 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:01.570 MemoryStore: INFO: Block broadcast_60_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:01.575 BlockManagerInfo: INFO: Added broadcast_60_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:01.577 SparkContext: INFO: Created broadcast 60 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:01.645 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:01.646 DAGScheduler: INFO: Registering RDD 64 (treeAggregate at RowMatrix.scala:94) as input to shuffle 12 2023-04-22 21:12:01.646 DAGScheduler: INFO: Got job 14 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:01.646 DAGScheduler: INFO: Final stage: ResultStage 27 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:01.646 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 26) 2023-04-22 21:12:01.646 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 26) 2023-04-22 21:12:01.654 DAGScheduler: INFO: Submitting ShuffleMapStage 26 (MapPartitionsRDD[64] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:01.692 MemoryStore: INFO: Block broadcast_61 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:01.700 MemoryStore: INFO: Block broadcast_61_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:01.700 BlockManagerInfo: INFO: Added broadcast_61_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:01.719 SparkContext: INFO: Created broadcast 61 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:01.719 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 26 (MapPartitionsRDD[64] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:01.719 TaskSchedulerImpl: INFO: Adding task set 26.0 with 8 tasks resource profile 0 2023-04-22 21:12:01.720 TaskSetManager: INFO: Starting task 0.0 in stage 26.0 (TID 136) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.720 Executor: INFO: Running task 0.0 in stage 26.0 (TID 136) 2023-04-22 21:12:01.747 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:01.881 Executor: INFO: Finished task 0.0 in stage 26.0 (TID 136). 
1197 bytes result sent to driver 2023-04-22 21:12:01.882 TaskSetManager: INFO: Starting task 1.0 in stage 26.0 (TID 137) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:01.883 TaskSetManager: INFO: Finished task 0.0 in stage 26.0 (TID 136) in 163 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:01.883 Executor: INFO: Running task 1.0 in stage 26.0 (TID 137) 2023-04-22 21:12:01.910 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:02.037 Executor: INFO: Finished task 1.0 in stage 26.0 (TID 137). 1197 bytes result sent to driver 2023-04-22 21:12:02.038 TaskSetManager: INFO: Starting task 2.0 in stage 26.0 (TID 138) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:02.038 TaskSetManager: INFO: Finished task 1.0 in stage 26.0 (TID 137) in 157 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:02.039 Executor: INFO: Running task 2.0 in stage 26.0 (TID 138) 2023-04-22 21:12:02.162 BlockManagerInfo: INFO: Removed broadcast_54_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.164 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:02.217 BlockManagerInfo: INFO: Removed broadcast_58_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.271 BlockManagerInfo: INFO: Removed broadcast_57_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.314 Executor: INFO: Finished task 2.0 in stage 26.0 (TID 138). 1240 bytes result sent to driver 2023-04-22 21:12:02.316 TaskSetManager: INFO: Starting task 3.0 in stage 26.0 (TID 139) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:02.323 TaskSetManager: INFO: Finished task 2.0 in stage 26.0 (TID 138) in 286 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:02.335 BlockManagerInfo: INFO: Removed broadcast_59_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.354 Executor: INFO: Running task 3.0 in stage 26.0 (TID 139) 2023-04-22 21:12:02.392 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:02.434 BlockManagerInfo: INFO: Removed broadcast_48_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.523 BlockManagerInfo: INFO: Removed broadcast_52_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.562 Executor: INFO: Finished task 3.0 in stage 26.0 (TID 139). 
1197 bytes result sent to driver 2023-04-22 21:12:02.565 TaskSetManager: INFO: Starting task 4.0 in stage 26.0 (TID 140) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:02.565 TaskSetManager: INFO: Finished task 3.0 in stage 26.0 (TID 139) in 249 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:02.568 Executor: INFO: Running task 4.0 in stage 26.0 (TID 140) 2023-04-22 21:12:02.600 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:02.633 BlockManagerInfo: INFO: Removed broadcast_44_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.732 Executor: INFO: Finished task 4.0 in stage 26.0 (TID 140). 1197 bytes result sent to driver 2023-04-22 21:12:02.733 TaskSetManager: INFO: Starting task 5.0 in stage 26.0 (TID 141) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:02.733 TaskSetManager: INFO: Finished task 4.0 in stage 26.0 (TID 140) in 168 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:02.734 Executor: INFO: Running task 5.0 in stage 26.0 (TID 141) 2023-04-22 21:12:02.763 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:02.782 BlockManagerInfo: INFO: Removed broadcast_47_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.846 BlockManagerInfo: INFO: Removed broadcast_42_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:02.906 Executor: INFO: Finished task 5.0 in stage 26.0 (TID 141). 1197 bytes result sent to driver 2023-04-22 21:12:02.907 TaskSetManager: INFO: Starting task 6.0 in stage 26.0 (TID 142) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:02.907 TaskSetManager: INFO: Finished task 5.0 in stage 26.0 (TID 141) in 174 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:02.907 Executor: INFO: Running task 6.0 in stage 26.0 (TID 142) 2023-04-22 21:12:02.944 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:02.996 BlockManagerInfo: INFO: Removed broadcast_51_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.031 BlockManagerInfo: INFO: Removed broadcast_46_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.058 BlockManagerInfo: INFO: Removed broadcast_50_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.088 BlockManagerInfo: INFO: Removed broadcast_45_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.114 BlockManagerInfo: INFO: Removed broadcast_49_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.125 Executor: INFO: Finished task 6.0 in stage 26.0 (TID 142). 
1197 bytes result sent to driver 2023-04-22 21:12:03.127 TaskSetManager: INFO: Starting task 7.0 in stage 26.0 (TID 143) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:03.128 TaskSetManager: INFO: Finished task 6.0 in stage 26.0 (TID 142) in 221 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:03.128 Executor: INFO: Running task 7.0 in stage 26.0 (TID 143) 2023-04-22 21:12:03.162 BlockManagerInfo: INFO: Removed broadcast_55_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.168 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:03.211 BlockManagerInfo: INFO: Removed broadcast_53_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.267 BlockManagerInfo: INFO: Removed broadcast_56_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.332 Executor: INFO: Finished task 7.0 in stage 26.0 (TID 143). 1197 bytes result sent to driver 2023-04-22 21:12:03.334 TaskSetManager: INFO: Finished task 7.0 in stage 26.0 (TID 143) in 207 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:03.334 TaskSchedulerImpl: INFO: Removed TaskSet 26.0, whose tasks have all completed, from pool 2023-04-22 21:12:03.334 DAGScheduler: INFO: ShuffleMapStage 26 (treeAggregate at RowMatrix.scala:94) finished in 1.680 s 2023-04-22 21:12:03.334 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:03.334 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:03.334 DAGScheduler: INFO: waiting: Set(ResultStage 27) 2023-04-22 21:12:03.334 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:03.334 DAGScheduler: INFO: Submitting ResultStage 27 (MapPartitionsRDD[66] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:03.375 MemoryStore: INFO: Block broadcast_62 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:03.381 MemoryStore: INFO: Block broadcast_62_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:03.381 BlockManagerInfo: INFO: Added broadcast_62_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.402 SparkContext: INFO: Created broadcast 62 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:03.402 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 27 (MapPartitionsRDD[66] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:03.402 TaskSchedulerImpl: INFO: Adding task set 27.0 with 2 tasks resource profile 0 2023-04-22 21:12:03.403 TaskSetManager: INFO: Starting task 0.0 in stage 27.0 (TID 144) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:03.403 Executor: INFO: Running task 0.0 in stage 27.0 (TID 144) 2023-04-22 21:12:03.447 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:03.447 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:03.452 Executor: INFO: Finished task 0.0 in stage 27.0 (TID 144). 
34646 bytes result sent to driver 2023-04-22 21:12:03.453 TaskSetManager: INFO: Starting task 1.0 in stage 27.0 (TID 145) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:03.453 TaskSetManager: INFO: Finished task 0.0 in stage 27.0 (TID 144) in 50 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:03.467 Executor: INFO: Running task 1.0 in stage 27.0 (TID 145) 2023-04-22 21:12:03.506 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:03.506 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:03.511 Executor: INFO: Finished task 1.0 in stage 27.0 (TID 145). 34646 bytes result sent to driver 2023-04-22 21:12:03.512 TaskSetManager: INFO: Finished task 1.0 in stage 27.0 (TID 145) in 59 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:03.512 TaskSchedulerImpl: INFO: Removed TaskSet 27.0, whose tasks have all completed, from pool 2023-04-22 21:12:03.512 DAGScheduler: INFO: ResultStage 27 (treeAggregate at RowMatrix.scala:94) finished in 0.177 s 2023-04-22 21:12:03.512 DAGScheduler: INFO: Job 14 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:03.512 TaskSchedulerImpl: INFO: Killing all running tasks in stage 27: Stage finished 2023-04-22 21:12:03.513 DAGScheduler: INFO: Job 14 finished: treeAggregate at RowMatrix.scala:94, took 1.867810 s 2023-04-22 21:12:03.516 MemoryStore: INFO: Block broadcast_63 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:03.517 MemoryStore: INFO: Block broadcast_63_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:03.518 BlockManagerInfo: INFO: Added broadcast_63_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.519 SparkContext: INFO: Created broadcast 63 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:03.636 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:03.638 DAGScheduler: INFO: Registering RDD 68 (treeAggregate at RowMatrix.scala:94) as input to shuffle 13 2023-04-22 21:12:03.639 DAGScheduler: INFO: Got job 15 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:03.639 DAGScheduler: INFO: Final stage: ResultStage 29 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:03.639 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 28) 2023-04-22 21:12:03.639 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 28) 2023-04-22 21:12:03.640 DAGScheduler: INFO: Submitting ShuffleMapStage 28 (MapPartitionsRDD[68] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:03.678 MemoryStore: INFO: Block broadcast_64 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:03.686 MemoryStore: INFO: Block broadcast_64_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:03.687 BlockManagerInfo: INFO: Added broadcast_64_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:03.687 SparkContext: INFO: Created broadcast 64 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:03.687 DAGScheduler: INFO: Submitting 8 missing tasks from 
ShuffleMapStage 28 (MapPartitionsRDD[68] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:03.687 TaskSchedulerImpl: INFO: Adding task set 28.0 with 8 tasks resource profile 0 2023-04-22 21:12:03.688 TaskSetManager: INFO: Starting task 0.0 in stage 28.0 (TID 146) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:03.689 Executor: INFO: Running task 0.0 in stage 28.0 (TID 146) 2023-04-22 21:12:03.714 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:03.841 Executor: INFO: Finished task 0.0 in stage 28.0 (TID 146). 1197 bytes result sent to driver 2023-04-22 21:12:03.850 TaskSetManager: INFO: Starting task 1.0 in stage 28.0 (TID 147) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:03.851 TaskSetManager: INFO: Finished task 0.0 in stage 28.0 (TID 146) in 163 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:03.864 Executor: INFO: Running task 1.0 in stage 28.0 (TID 147) 2023-04-22 21:12:03.914 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:04.124 Executor: INFO: Finished task 1.0 in stage 28.0 (TID 147). 1197 bytes result sent to driver 2023-04-22 21:12:04.124 TaskSetManager: INFO: Starting task 2.0 in stage 28.0 (TID 148) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.125 TaskSetManager: INFO: Finished task 1.0 in stage 28.0 (TID 147) in 275 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:04.128 Executor: INFO: Running task 2.0 in stage 28.0 (TID 148) 2023-04-22 21:12:04.153 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:04.288 Executor: INFO: Finished task 2.0 in stage 28.0 (TID 148). 1197 bytes result sent to driver 2023-04-22 21:12:04.291 TaskSetManager: INFO: Starting task 3.0 in stage 28.0 (TID 149) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.292 TaskSetManager: INFO: Finished task 2.0 in stage 28.0 (TID 148) in 168 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:04.303 Executor: INFO: Running task 3.0 in stage 28.0 (TID 149) 2023-04-22 21:12:04.328 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:04.452 Executor: INFO: Finished task 3.0 in stage 28.0 (TID 149). 1197 bytes result sent to driver 2023-04-22 21:12:04.452 TaskSetManager: INFO: Starting task 4.0 in stage 28.0 (TID 150) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.452 TaskSetManager: INFO: Finished task 3.0 in stage 28.0 (TID 149) in 161 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:04.453 Executor: INFO: Running task 4.0 in stage 28.0 (TID 150) 2023-04-22 21:12:04.479 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:04.609 Executor: INFO: Finished task 4.0 in stage 28.0 (TID 150). 
1197 bytes result sent to driver 2023-04-22 21:12:04.609 TaskSetManager: INFO: Starting task 5.0 in stage 28.0 (TID 151) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.609 TaskSetManager: INFO: Finished task 4.0 in stage 28.0 (TID 150) in 157 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:04.610 Executor: INFO: Running task 5.0 in stage 28.0 (TID 151) 2023-04-22 21:12:04.637 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:04.762 Executor: INFO: Finished task 5.0 in stage 28.0 (TID 151). 1197 bytes result sent to driver 2023-04-22 21:12:04.771 TaskSetManager: INFO: Starting task 6.0 in stage 28.0 (TID 152) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.772 TaskSetManager: INFO: Finished task 5.0 in stage 28.0 (TID 151) in 163 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:04.776 Executor: INFO: Running task 6.0 in stage 28.0 (TID 152) 2023-04-22 21:12:04.802 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:04.929 Executor: INFO: Finished task 6.0 in stage 28.0 (TID 152). 1197 bytes result sent to driver 2023-04-22 21:12:04.929 TaskSetManager: INFO: Starting task 7.0 in stage 28.0 (TID 153) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:04.930 TaskSetManager: INFO: Finished task 6.0 in stage 28.0 (TID 152) in 159 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:04.932 Executor: INFO: Running task 7.0 in stage 28.0 (TID 153) 2023-04-22 21:12:04.957 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:05.085 Executor: INFO: Finished task 7.0 in stage 28.0 (TID 153). 
1197 bytes result sent to driver 2023-04-22 21:12:05.086 TaskSetManager: INFO: Finished task 7.0 in stage 28.0 (TID 153) in 157 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:05.086 TaskSchedulerImpl: INFO: Removed TaskSet 28.0, whose tasks have all completed, from pool 2023-04-22 21:12:05.086 DAGScheduler: INFO: ShuffleMapStage 28 (treeAggregate at RowMatrix.scala:94) finished in 1.445 s 2023-04-22 21:12:05.086 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:05.086 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:05.086 DAGScheduler: INFO: waiting: Set(ResultStage 29) 2023-04-22 21:12:05.086 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:05.086 DAGScheduler: INFO: Submitting ResultStage 29 (MapPartitionsRDD[70] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:05.128 MemoryStore: INFO: Block broadcast_65 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:05.134 MemoryStore: INFO: Block broadcast_65_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:05.135 BlockManagerInfo: INFO: Added broadcast_65_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:05.136 SparkContext: INFO: Created broadcast 65 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:05.136 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 29 (MapPartitionsRDD[70] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:05.136 TaskSchedulerImpl: INFO: Adding task set 29.0 with 2 tasks resource profile 0 2023-04-22 21:12:05.137 TaskSetManager: INFO: Starting task 0.0 in stage 29.0 (TID 154) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.137 Executor: INFO: Running task 0.0 in stage 29.0 (TID 154) 2023-04-22 21:12:05.163 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:05.163 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:05.168 Executor: INFO: Finished task 0.0 in stage 29.0 (TID 154). 34646 bytes result sent to driver 2023-04-22 21:12:05.170 TaskSetManager: INFO: Starting task 1.0 in stage 29.0 (TID 155) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.170 TaskSetManager: INFO: Finished task 0.0 in stage 29.0 (TID 154) in 33 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:05.171 Executor: INFO: Running task 1.0 in stage 29.0 (TID 155) 2023-04-22 21:12:05.198 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:05.198 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:05.202 Executor: INFO: Finished task 1.0 in stage 29.0 (TID 155). 
34646 bytes result sent to driver 2023-04-22 21:12:05.204 TaskSetManager: INFO: Finished task 1.0 in stage 29.0 (TID 155) in 34 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:05.204 TaskSchedulerImpl: INFO: Removed TaskSet 29.0, whose tasks have all completed, from pool 2023-04-22 21:12:05.204 DAGScheduler: INFO: ResultStage 29 (treeAggregate at RowMatrix.scala:94) finished in 0.117 s 2023-04-22 21:12:05.204 DAGScheduler: INFO: Job 15 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:05.204 TaskSchedulerImpl: INFO: Killing all running tasks in stage 29: Stage finished 2023-04-22 21:12:05.205 DAGScheduler: INFO: Job 15 finished: treeAggregate at RowMatrix.scala:94, took 1.568457 s 2023-04-22 21:12:05.207 MemoryStore: INFO: Block broadcast_66 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:05.209 MemoryStore: INFO: Block broadcast_66_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:05.213 BlockManagerInfo: INFO: Added broadcast_66_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:05.214 SparkContext: INFO: Created broadcast 66 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:05.279 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:05.280 DAGScheduler: INFO: Registering RDD 72 (treeAggregate at RowMatrix.scala:94) as input to shuffle 14 2023-04-22 21:12:05.280 DAGScheduler: INFO: Got job 16 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:05.280 DAGScheduler: INFO: Final stage: ResultStage 31 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:05.280 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 30) 2023-04-22 21:12:05.280 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 30) 2023-04-22 21:12:05.286 DAGScheduler: INFO: Submitting ShuffleMapStage 30 (MapPartitionsRDD[72] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:05.321 MemoryStore: INFO: Block broadcast_67 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:05.327 MemoryStore: INFO: Block broadcast_67_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:05.340 BlockManagerInfo: INFO: Added broadcast_67_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:05.340 SparkContext: INFO: Created broadcast 67 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:05.341 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 30 (MapPartitionsRDD[72] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:05.341 TaskSchedulerImpl: INFO: Adding task set 30.0 with 8 tasks resource profile 0 2023-04-22 21:12:05.342 TaskSetManager: INFO: Starting task 0.0 in stage 30.0 (TID 156) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.342 Executor: INFO: Running task 0.0 in stage 30.0 (TID 156) 2023-04-22 21:12:05.371 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:05.535 Executor: INFO: Finished task 0.0 in stage 30.0 (TID 156). 
1197 bytes result sent to driver 2023-04-22 21:12:05.536 TaskSetManager: INFO: Starting task 1.0 in stage 30.0 (TID 157) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.537 TaskSetManager: INFO: Finished task 0.0 in stage 30.0 (TID 156) in 195 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:05.537 Executor: INFO: Running task 1.0 in stage 30.0 (TID 157) 2023-04-22 21:12:05.563 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:05.689 Executor: INFO: Finished task 1.0 in stage 30.0 (TID 157). 1197 bytes result sent to driver 2023-04-22 21:12:05.691 TaskSetManager: INFO: Starting task 2.0 in stage 30.0 (TID 158) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.691 TaskSetManager: INFO: Finished task 1.0 in stage 30.0 (TID 157) in 155 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:05.692 Executor: INFO: Running task 2.0 in stage 30.0 (TID 158) 2023-04-22 21:12:05.717 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:05.845 Executor: INFO: Finished task 2.0 in stage 30.0 (TID 158). 1197 bytes result sent to driver 2023-04-22 21:12:05.846 TaskSetManager: INFO: Starting task 3.0 in stage 30.0 (TID 159) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.846 TaskSetManager: INFO: Finished task 2.0 in stage 30.0 (TID 158) in 155 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:05.848 Executor: INFO: Running task 3.0 in stage 30.0 (TID 159) 2023-04-22 21:12:05.874 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:05.998 Executor: INFO: Finished task 3.0 in stage 30.0 (TID 159). 1197 bytes result sent to driver 2023-04-22 21:12:05.999 TaskSetManager: INFO: Starting task 4.0 in stage 30.0 (TID 160) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:05.999 TaskSetManager: INFO: Finished task 3.0 in stage 30.0 (TID 159) in 154 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:05.999 Executor: INFO: Running task 4.0 in stage 30.0 (TID 160) 2023-04-22 21:12:06.025 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:06.154 Executor: INFO: Finished task 4.0 in stage 30.0 (TID 160). 1197 bytes result sent to driver 2023-04-22 21:12:06.154 TaskSetManager: INFO: Starting task 5.0 in stage 30.0 (TID 161) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.154 TaskSetManager: INFO: Finished task 4.0 in stage 30.0 (TID 160) in 156 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:06.158 Executor: INFO: Running task 5.0 in stage 30.0 (TID 161) 2023-04-22 21:12:06.183 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:06.322 Executor: INFO: Finished task 5.0 in stage 30.0 (TID 161). 
1197 bytes result sent to driver 2023-04-22 21:12:06.323 TaskSetManager: INFO: Starting task 6.0 in stage 30.0 (TID 162) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.324 TaskSetManager: INFO: Finished task 5.0 in stage 30.0 (TID 161) in 170 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:06.325 Executor: INFO: Running task 6.0 in stage 30.0 (TID 162) 2023-04-22 21:12:06.350 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:06.478 Executor: INFO: Finished task 6.0 in stage 30.0 (TID 162). 1197 bytes result sent to driver 2023-04-22 21:12:06.478 TaskSetManager: INFO: Starting task 7.0 in stage 30.0 (TID 163) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.479 TaskSetManager: INFO: Finished task 6.0 in stage 30.0 (TID 162) in 156 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:06.480 Executor: INFO: Running task 7.0 in stage 30.0 (TID 163) 2023-04-22 21:12:06.508 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:06.629 Executor: INFO: Finished task 7.0 in stage 30.0 (TID 163). 1197 bytes result sent to driver 2023-04-22 21:12:06.630 TaskSetManager: INFO: Finished task 7.0 in stage 30.0 (TID 163) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:06.630 TaskSchedulerImpl: INFO: Removed TaskSet 30.0, whose tasks have all completed, from pool 2023-04-22 21:12:06.630 DAGScheduler: INFO: ShuffleMapStage 30 (treeAggregate at RowMatrix.scala:94) finished in 1.344 s 2023-04-22 21:12:06.630 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:06.630 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:06.630 DAGScheduler: INFO: waiting: Set(ResultStage 31) 2023-04-22 21:12:06.630 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:06.631 DAGScheduler: INFO: Submitting ResultStage 31 (MapPartitionsRDD[74] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:06.687 MemoryStore: INFO: Block broadcast_68 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:06.694 MemoryStore: INFO: Block broadcast_68_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:06.695 BlockManagerInfo: INFO: Added broadcast_68_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:06.695 SparkContext: INFO: Created broadcast 68 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:06.695 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 31 (MapPartitionsRDD[74] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:06.695 TaskSchedulerImpl: INFO: Adding task set 31.0 with 2 tasks resource profile 0 2023-04-22 21:12:06.696 TaskSetManager: INFO: Starting task 0.0 in stage 31.0 (TID 164) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.699 Executor: INFO: Running task 0.0 in stage 31.0 (TID 164) 2023-04-22 21:12:06.726 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:06.726 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 1 ms 2023-04-22 
21:12:06.733 Executor: INFO: Finished task 0.0 in stage 31.0 (TID 164). 34646 bytes result sent to driver 2023-04-22 21:12:06.736 TaskSetManager: INFO: Starting task 1.0 in stage 31.0 (TID 165) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.736 TaskSetManager: INFO: Finished task 0.0 in stage 31.0 (TID 164) in 40 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:06.737 Executor: INFO: Running task 1.0 in stage 31.0 (TID 165) 2023-04-22 21:12:06.762 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:06.762 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:06.767 Executor: INFO: Finished task 1.0 in stage 31.0 (TID 165). 34646 bytes result sent to driver 2023-04-22 21:12:06.768 TaskSetManager: INFO: Finished task 1.0 in stage 31.0 (TID 165) in 32 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:06.768 TaskSchedulerImpl: INFO: Removed TaskSet 31.0, whose tasks have all completed, from pool 2023-04-22 21:12:06.768 DAGScheduler: INFO: ResultStage 31 (treeAggregate at RowMatrix.scala:94) finished in 0.137 s 2023-04-22 21:12:06.768 DAGScheduler: INFO: Job 16 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:06.769 TaskSchedulerImpl: INFO: Killing all running tasks in stage 31: Stage finished 2023-04-22 21:12:06.769 DAGScheduler: INFO: Job 16 finished: treeAggregate at RowMatrix.scala:94, took 1.489829 s 2023-04-22 21:12:06.771 MemoryStore: INFO: Block broadcast_69 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:06.773 MemoryStore: INFO: Block broadcast_69_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:06.773 BlockManagerInfo: INFO: Added broadcast_69_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:06.775 SparkContext: INFO: Created broadcast 69 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:06.839 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:06.841 DAGScheduler: INFO: Registering RDD 76 (treeAggregate at RowMatrix.scala:94) as input to shuffle 15 2023-04-22 21:12:06.841 DAGScheduler: INFO: Got job 17 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:06.841 DAGScheduler: INFO: Final stage: ResultStage 33 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:06.841 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 32) 2023-04-22 21:12:06.841 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 32) 2023-04-22 21:12:06.844 DAGScheduler: INFO: Submitting ShuffleMapStage 32 (MapPartitionsRDD[76] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:06.880 MemoryStore: INFO: Block broadcast_70 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:06.886 MemoryStore: INFO: Block broadcast_70_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:06.887 BlockManagerInfo: INFO: Added broadcast_70_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:06.889 SparkContext: INFO: Created broadcast 70 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:12:06.890 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 32 (MapPartitionsRDD[76] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:06.890 TaskSchedulerImpl: INFO: Adding task set 32.0 with 8 tasks resource profile 0 2023-04-22 21:12:06.891 TaskSetManager: INFO: Starting task 0.0 in stage 32.0 (TID 166) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:06.891 Executor: INFO: Running task 0.0 in stage 32.0 (TID 166) 2023-04-22 21:12:06.916 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:07.041 Executor: INFO: Finished task 0.0 in stage 32.0 (TID 166). 1197 bytes result sent to driver 2023-04-22 21:12:07.041 TaskSetManager: INFO: Starting task 1.0 in stage 32.0 (TID 167) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.042 TaskSetManager: INFO: Finished task 0.0 in stage 32.0 (TID 166) in 152 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:07.042 Executor: INFO: Running task 1.0 in stage 32.0 (TID 167) 2023-04-22 21:12:07.068 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:07.192 Executor: INFO: Finished task 1.0 in stage 32.0 (TID 167). 1197 bytes result sent to driver 2023-04-22 21:12:07.193 TaskSetManager: INFO: Starting task 2.0 in stage 32.0 (TID 168) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.194 TaskSetManager: INFO: Finished task 1.0 in stage 32.0 (TID 167) in 153 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:07.195 Executor: INFO: Running task 2.0 in stage 32.0 (TID 168) 2023-04-22 21:12:07.220 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:07.361 Executor: INFO: Finished task 2.0 in stage 32.0 (TID 168). 1197 bytes result sent to driver 2023-04-22 21:12:07.362 TaskSetManager: INFO: Starting task 3.0 in stage 32.0 (TID 169) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.362 TaskSetManager: INFO: Finished task 2.0 in stage 32.0 (TID 168) in 169 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:07.363 Executor: INFO: Running task 3.0 in stage 32.0 (TID 169) 2023-04-22 21:12:07.388 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:07.523 Executor: INFO: Finished task 3.0 in stage 32.0 (TID 169). 1197 bytes result sent to driver 2023-04-22 21:12:07.524 TaskSetManager: INFO: Starting task 4.0 in stage 32.0 (TID 170) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.524 TaskSetManager: INFO: Finished task 3.0 in stage 32.0 (TID 169) in 162 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:07.525 Executor: INFO: Running task 4.0 in stage 32.0 (TID 170) 2023-04-22 21:12:07.550 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:07.674 Executor: INFO: Finished task 4.0 in stage 32.0 (TID 170). 
1197 bytes result sent to driver 2023-04-22 21:12:07.674 TaskSetManager: INFO: Starting task 5.0 in stage 32.0 (TID 171) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.675 TaskSetManager: INFO: Finished task 4.0 in stage 32.0 (TID 170) in 152 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:07.677 Executor: INFO: Running task 5.0 in stage 32.0 (TID 171) 2023-04-22 21:12:07.703 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:07.834 Executor: INFO: Finished task 5.0 in stage 32.0 (TID 171). 1197 bytes result sent to driver 2023-04-22 21:12:07.835 TaskSetManager: INFO: Starting task 6.0 in stage 32.0 (TID 172) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.835 TaskSetManager: INFO: Finished task 5.0 in stage 32.0 (TID 171) in 161 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:07.837 Executor: INFO: Running task 6.0 in stage 32.0 (TID 172) 2023-04-22 21:12:07.862 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:07.989 Executor: INFO: Finished task 6.0 in stage 32.0 (TID 172). 1197 bytes result sent to driver 2023-04-22 21:12:07.990 TaskSetManager: INFO: Starting task 7.0 in stage 32.0 (TID 173) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:07.990 TaskSetManager: INFO: Finished task 6.0 in stage 32.0 (TID 172) in 155 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:08.004 Executor: INFO: Running task 7.0 in stage 32.0 (TID 173) 2023-04-22 21:12:08.029 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:08.150 Executor: INFO: Finished task 7.0 in stage 32.0 (TID 173). 
1197 bytes result sent to driver 2023-04-22 21:12:08.152 TaskSetManager: INFO: Finished task 7.0 in stage 32.0 (TID 173) in 162 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:08.152 TaskSchedulerImpl: INFO: Removed TaskSet 32.0, whose tasks have all completed, from pool 2023-04-22 21:12:08.153 DAGScheduler: INFO: ShuffleMapStage 32 (treeAggregate at RowMatrix.scala:94) finished in 1.307 s 2023-04-22 21:12:08.153 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:08.153 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:08.153 DAGScheduler: INFO: waiting: Set(ResultStage 33) 2023-04-22 21:12:08.153 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:08.153 DAGScheduler: INFO: Submitting ResultStage 33 (MapPartitionsRDD[78] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:08.196 MemoryStore: INFO: Block broadcast_71 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:08.202 MemoryStore: INFO: Block broadcast_71_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:08.203 BlockManagerInfo: INFO: Added broadcast_71_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:08.204 SparkContext: INFO: Created broadcast 71 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:08.204 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 33 (MapPartitionsRDD[78] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:08.204 TaskSchedulerImpl: INFO: Adding task set 33.0 with 2 tasks resource profile 0 2023-04-22 21:12:08.205 TaskSetManager: INFO: Starting task 0.0 in stage 33.0 (TID 174) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.205 Executor: INFO: Running task 0.0 in stage 33.0 (TID 174) 2023-04-22 21:12:08.232 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:08.232 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:08.236 Executor: INFO: Finished task 0.0 in stage 33.0 (TID 174). 34646 bytes result sent to driver 2023-04-22 21:12:08.242 TaskSetManager: INFO: Starting task 1.0 in stage 33.0 (TID 175) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.242 TaskSetManager: INFO: Finished task 0.0 in stage 33.0 (TID 174) in 37 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:08.247 Executor: INFO: Running task 1.0 in stage 33.0 (TID 175) 2023-04-22 21:12:08.273 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:08.273 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:08.277 Executor: INFO: Finished task 1.0 in stage 33.0 (TID 175). 
34646 bytes result sent to driver 2023-04-22 21:12:08.279 TaskSetManager: INFO: Finished task 1.0 in stage 33.0 (TID 175) in 37 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:08.279 TaskSchedulerImpl: INFO: Removed TaskSet 33.0, whose tasks have all completed, from pool 2023-04-22 21:12:08.279 DAGScheduler: INFO: ResultStage 33 (treeAggregate at RowMatrix.scala:94) finished in 0.126 s 2023-04-22 21:12:08.280 DAGScheduler: INFO: Job 17 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:08.280 TaskSchedulerImpl: INFO: Killing all running tasks in stage 33: Stage finished 2023-04-22 21:12:08.281 DAGScheduler: INFO: Job 17 finished: treeAggregate at RowMatrix.scala:94, took 1.441474 s 2023-04-22 21:12:08.283 MemoryStore: INFO: Block broadcast_72 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:08.285 MemoryStore: INFO: Block broadcast_72_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:08.285 BlockManagerInfo: INFO: Added broadcast_72_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:08.288 SparkContext: INFO: Created broadcast 72 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:08.364 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:08.365 DAGScheduler: INFO: Registering RDD 80 (treeAggregate at RowMatrix.scala:94) as input to shuffle 16 2023-04-22 21:12:08.365 DAGScheduler: INFO: Got job 18 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:08.365 DAGScheduler: INFO: Final stage: ResultStage 35 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:08.365 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 34) 2023-04-22 21:12:08.366 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 34) 2023-04-22 21:12:08.368 DAGScheduler: INFO: Submitting ShuffleMapStage 34 (MapPartitionsRDD[80] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:08.405 MemoryStore: INFO: Block broadcast_73 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:08.412 MemoryStore: INFO: Block broadcast_73_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:08.420 BlockManagerInfo: INFO: Added broadcast_73_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:08.420 SparkContext: INFO: Created broadcast 73 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:08.421 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 34 (MapPartitionsRDD[80] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:08.421 TaskSchedulerImpl: INFO: Adding task set 34.0 with 8 tasks resource profile 0 2023-04-22 21:12:08.422 TaskSetManager: INFO: Starting task 0.0 in stage 34.0 (TID 176) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.422 Executor: INFO: Running task 0.0 in stage 34.0 (TID 176) 2023-04-22 21:12:08.466 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:08.597 Executor: INFO: Finished task 0.0 in stage 34.0 (TID 176). 
1197 bytes result sent to driver 2023-04-22 21:12:08.597 TaskSetManager: INFO: Starting task 1.0 in stage 34.0 (TID 177) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.598 TaskSetManager: INFO: Finished task 0.0 in stage 34.0 (TID 176) in 176 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:08.598 Executor: INFO: Running task 1.0 in stage 34.0 (TID 177) 2023-04-22 21:12:08.624 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:08.748 Executor: INFO: Finished task 1.0 in stage 34.0 (TID 177). 1197 bytes result sent to driver 2023-04-22 21:12:08.749 TaskSetManager: INFO: Starting task 2.0 in stage 34.0 (TID 178) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.750 TaskSetManager: INFO: Finished task 1.0 in stage 34.0 (TID 177) in 153 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:08.751 Executor: INFO: Running task 2.0 in stage 34.0 (TID 178) 2023-04-22 21:12:08.777 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:08.899 Executor: INFO: Finished task 2.0 in stage 34.0 (TID 178). 1197 bytes result sent to driver 2023-04-22 21:12:08.900 TaskSetManager: INFO: Starting task 3.0 in stage 34.0 (TID 179) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:08.900 TaskSetManager: INFO: Finished task 2.0 in stage 34.0 (TID 178) in 151 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:08.901 Executor: INFO: Running task 3.0 in stage 34.0 (TID 179) 2023-04-22 21:12:08.926 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:09.054 Executor: INFO: Finished task 3.0 in stage 34.0 (TID 179). 1197 bytes result sent to driver 2023-04-22 21:12:09.054 TaskSetManager: INFO: Starting task 4.0 in stage 34.0 (TID 180) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.055 TaskSetManager: INFO: Finished task 3.0 in stage 34.0 (TID 179) in 155 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:09.055 Executor: INFO: Running task 4.0 in stage 34.0 (TID 180) 2023-04-22 21:12:09.080 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:09.209 Executor: INFO: Finished task 4.0 in stage 34.0 (TID 180). 1197 bytes result sent to driver 2023-04-22 21:12:09.211 TaskSetManager: INFO: Starting task 5.0 in stage 34.0 (TID 181) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.211 TaskSetManager: INFO: Finished task 4.0 in stage 34.0 (TID 180) in 157 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:09.212 Executor: INFO: Running task 5.0 in stage 34.0 (TID 181) 2023-04-22 21:12:09.238 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:09.378 Executor: INFO: Finished task 5.0 in stage 34.0 (TID 181). 
1197 bytes result sent to driver 2023-04-22 21:12:09.379 TaskSetManager: INFO: Starting task 6.0 in stage 34.0 (TID 182) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.379 TaskSetManager: INFO: Finished task 5.0 in stage 34.0 (TID 181) in 168 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:09.392 Executor: INFO: Running task 6.0 in stage 34.0 (TID 182) 2023-04-22 21:12:09.430 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:09.551 Executor: INFO: Finished task 6.0 in stage 34.0 (TID 182). 1197 bytes result sent to driver 2023-04-22 21:12:09.551 TaskSetManager: INFO: Starting task 7.0 in stage 34.0 (TID 183) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.557 TaskSetManager: INFO: Finished task 6.0 in stage 34.0 (TID 182) in 179 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:09.558 Executor: INFO: Running task 7.0 in stage 34.0 (TID 183) 2023-04-22 21:12:09.585 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:09.709 Executor: INFO: Finished task 7.0 in stage 34.0 (TID 183). 1197 bytes result sent to driver 2023-04-22 21:12:09.710 TaskSetManager: INFO: Finished task 7.0 in stage 34.0 (TID 183) in 159 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:09.710 TaskSchedulerImpl: INFO: Removed TaskSet 34.0, whose tasks have all completed, from pool 2023-04-22 21:12:09.710 DAGScheduler: INFO: ShuffleMapStage 34 (treeAggregate at RowMatrix.scala:94) finished in 1.341 s 2023-04-22 21:12:09.710 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:09.710 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:09.710 DAGScheduler: INFO: waiting: Set(ResultStage 35) 2023-04-22 21:12:09.710 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:09.711 DAGScheduler: INFO: Submitting ResultStage 35 (MapPartitionsRDD[82] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:09.751 MemoryStore: INFO: Block broadcast_74 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:09.757 MemoryStore: INFO: Block broadcast_74_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:09.758 BlockManagerInfo: INFO: Added broadcast_74_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:09.759 SparkContext: INFO: Created broadcast 74 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:09.759 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 35 (MapPartitionsRDD[82] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:09.759 TaskSchedulerImpl: INFO: Adding task set 35.0 with 2 tasks resource profile 0 2023-04-22 21:12:09.761 TaskSetManager: INFO: Starting task 0.0 in stage 35.0 (TID 184) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.761 Executor: INFO: Running task 0.0 in stage 35.0 (TID 184) 2023-04-22 21:12:09.789 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:09.789 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:09.793 Executor: INFO: Finished task 0.0 in stage 35.0 (TID 184). 34646 bytes result sent to driver 2023-04-22 21:12:09.796 TaskSetManager: INFO: Starting task 1.0 in stage 35.0 (TID 185) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.796 TaskSetManager: INFO: Finished task 0.0 in stage 35.0 (TID 184) in 35 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:09.797 Executor: INFO: Running task 1.0 in stage 35.0 (TID 185) 2023-04-22 21:12:09.822 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:09.822 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:09.826 Executor: INFO: Finished task 1.0 in stage 35.0 (TID 185). 34646 bytes result sent to driver 2023-04-22 21:12:09.827 TaskSetManager: INFO: Finished task 1.0 in stage 35.0 (TID 185) in 31 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:09.827 TaskSchedulerImpl: INFO: Removed TaskSet 35.0, whose tasks have all completed, from pool 2023-04-22 21:12:09.828 DAGScheduler: INFO: ResultStage 35 (treeAggregate at RowMatrix.scala:94) finished in 0.117 s 2023-04-22 21:12:09.828 DAGScheduler: INFO: Job 18 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:09.828 TaskSchedulerImpl: INFO: Killing all running tasks in stage 35: Stage finished 2023-04-22 21:12:09.828 DAGScheduler: INFO: Job 18 finished: treeAggregate at RowMatrix.scala:94, took 1.464593 s 2023-04-22 21:12:09.831 MemoryStore: INFO: Block broadcast_75 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:09.832 MemoryStore: INFO: Block broadcast_75_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:09.833 BlockManagerInfo: INFO: Added broadcast_75_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:09.835 SparkContext: INFO: Created broadcast 75 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:09.906 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:09.907 DAGScheduler: INFO: Registering RDD 84 (treeAggregate at RowMatrix.scala:94) as input to shuffle 17 2023-04-22 21:12:09.908 DAGScheduler: INFO: Got job 19 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:09.908 DAGScheduler: INFO: Final stage: ResultStage 37 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:09.908 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 36) 2023-04-22 21:12:09.908 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 36) 2023-04-22 21:12:09.909 DAGScheduler: INFO: Submitting ShuffleMapStage 36 (MapPartitionsRDD[84] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:09.945 MemoryStore: INFO: Block broadcast_76 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:09.951 MemoryStore: INFO: Block broadcast_76_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:09.951 BlockManagerInfo: INFO: Added broadcast_76_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:09.955 SparkContext: INFO: Created broadcast 76 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:12:09.955 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 36 (MapPartitionsRDD[84] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:09.955 TaskSchedulerImpl: INFO: Adding task set 36.0 with 8 tasks resource profile 0 2023-04-22 21:12:09.957 TaskSetManager: INFO: Starting task 0.0 in stage 36.0 (TID 186) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:09.957 Executor: INFO: Running task 0.0 in stage 36.0 (TID 186) 2023-04-22 21:12:09.982 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:10.106 Executor: INFO: Finished task 0.0 in stage 36.0 (TID 186). 1197 bytes result sent to driver 2023-04-22 21:12:10.106 TaskSetManager: INFO: Starting task 1.0 in stage 36.0 (TID 187) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:10.106 TaskSetManager: INFO: Finished task 0.0 in stage 36.0 (TID 186) in 149 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:10.107 Executor: INFO: Running task 1.0 in stage 36.0 (TID 187) 2023-04-22 21:12:10.132 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:10.393 Executor: INFO: Finished task 1.0 in stage 36.0 (TID 187). 1197 bytes result sent to driver 2023-04-22 21:12:10.394 TaskSetManager: INFO: Starting task 2.0 in stage 36.0 (TID 188) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:10.396 TaskSetManager: INFO: Finished task 1.0 in stage 36.0 (TID 187) in 290 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:10.397 Executor: INFO: Running task 2.0 in stage 36.0 (TID 188) 2023-04-22 21:12:10.423 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:10.545 Executor: INFO: Finished task 2.0 in stage 36.0 (TID 188). 1197 bytes result sent to driver 2023-04-22 21:12:10.546 TaskSetManager: INFO: Starting task 3.0 in stage 36.0 (TID 189) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:10.546 TaskSetManager: INFO: Finished task 2.0 in stage 36.0 (TID 188) in 152 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:10.547 Executor: INFO: Running task 3.0 in stage 36.0 (TID 189) 2023-04-22 21:12:10.572 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:10.700 Executor: INFO: Finished task 3.0 in stage 36.0 (TID 189). 1197 bytes result sent to driver 2023-04-22 21:12:10.700 TaskSetManager: INFO: Starting task 4.0 in stage 36.0 (TID 190) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:10.700 TaskSetManager: INFO: Finished task 3.0 in stage 36.0 (TID 189) in 154 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:10.701 Executor: INFO: Running task 4.0 in stage 36.0 (TID 190) 2023-04-22 21:12:10.727 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:10.857 Executor: INFO: Finished task 4.0 in stage 36.0 (TID 190). 
1197 bytes result sent to driver 2023-04-22 21:12:10.857 TaskSetManager: INFO: Starting task 5.0 in stage 36.0 (TID 191) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:10.857 TaskSetManager: INFO: Finished task 4.0 in stage 36.0 (TID 190) in 157 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:10.859 Executor: INFO: Running task 5.0 in stage 36.0 (TID 191) 2023-04-22 21:12:10.885 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:11.010 Executor: INFO: Finished task 5.0 in stage 36.0 (TID 191). 1197 bytes result sent to driver 2023-04-22 21:12:11.011 TaskSetManager: INFO: Starting task 6.0 in stage 36.0 (TID 192) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.012 TaskSetManager: INFO: Finished task 5.0 in stage 36.0 (TID 191) in 155 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:11.020 Executor: INFO: Running task 6.0 in stage 36.0 (TID 192) 2023-04-22 21:12:11.046 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:11.173 Executor: INFO: Finished task 6.0 in stage 36.0 (TID 192). 1197 bytes result sent to driver 2023-04-22 21:12:11.174 TaskSetManager: INFO: Starting task 7.0 in stage 36.0 (TID 193) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.174 TaskSetManager: INFO: Finished task 6.0 in stage 36.0 (TID 192) in 163 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:11.177 Executor: INFO: Running task 7.0 in stage 36.0 (TID 193) 2023-04-22 21:12:11.203 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:11.326 Executor: INFO: Finished task 7.0 in stage 36.0 (TID 193). 
1197 bytes result sent to driver 2023-04-22 21:12:11.332 TaskSetManager: INFO: Finished task 7.0 in stage 36.0 (TID 193) in 159 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:11.332 TaskSchedulerImpl: INFO: Removed TaskSet 36.0, whose tasks have all completed, from pool 2023-04-22 21:12:11.333 DAGScheduler: INFO: ShuffleMapStage 36 (treeAggregate at RowMatrix.scala:94) finished in 1.422 s 2023-04-22 21:12:11.333 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:11.333 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:11.333 DAGScheduler: INFO: waiting: Set(ResultStage 37) 2023-04-22 21:12:11.333 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:11.333 DAGScheduler: INFO: Submitting ResultStage 37 (MapPartitionsRDD[86] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:11.374 MemoryStore: INFO: Block broadcast_77 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:11.383 MemoryStore: INFO: Block broadcast_77_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:11.385 BlockManagerInfo: INFO: Added broadcast_77_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.386 SparkContext: INFO: Created broadcast 77 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:11.386 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 37 (MapPartitionsRDD[86] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:11.386 TaskSchedulerImpl: INFO: Adding task set 37.0 with 2 tasks resource profile 0 2023-04-22 21:12:11.387 TaskSetManager: INFO: Starting task 0.0 in stage 37.0 (TID 194) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.387 Executor: INFO: Running task 0.0 in stage 37.0 (TID 194) 2023-04-22 21:12:11.413 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:11.413 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:11.417 Executor: INFO: Finished task 0.0 in stage 37.0 (TID 194). 34646 bytes result sent to driver 2023-04-22 21:12:11.420 TaskSetManager: INFO: Starting task 1.0 in stage 37.0 (TID 195) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.420 TaskSetManager: INFO: Finished task 0.0 in stage 37.0 (TID 194) in 34 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:11.421 Executor: INFO: Running task 1.0 in stage 37.0 (TID 195) 2023-04-22 21:12:11.534 BlockManagerInfo: INFO: Removed broadcast_66_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.538 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:11.538 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:11.547 Executor: INFO: Finished task 1.0 in stage 37.0 (TID 195). 
34689 bytes result sent to driver 2023-04-22 21:12:11.548 TaskSetManager: INFO: Finished task 1.0 in stage 37.0 (TID 195) in 128 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:11.548 TaskSchedulerImpl: INFO: Removed TaskSet 37.0, whose tasks have all completed, from pool 2023-04-22 21:12:11.548 DAGScheduler: INFO: ResultStage 37 (treeAggregate at RowMatrix.scala:94) finished in 0.215 s 2023-04-22 21:12:11.548 DAGScheduler: INFO: Job 19 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:11.548 TaskSchedulerImpl: INFO: Killing all running tasks in stage 37: Stage finished 2023-04-22 21:12:11.548 DAGScheduler: INFO: Job 19 finished: treeAggregate at RowMatrix.scala:94, took 1.642321 s 2023-04-22 21:12:11.551 MemoryStore: INFO: Block broadcast_78 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:11.553 MemoryStore: INFO: Block broadcast_78_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:11.556 BlockManagerInfo: INFO: Added broadcast_78_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.559 SparkContext: INFO: Created broadcast 78 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:11.615 BlockManagerInfo: INFO: Removed broadcast_65_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.641 BlockManagerInfo: INFO: Removed broadcast_64_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.654 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:11.655 DAGScheduler: INFO: Registering RDD 88 (treeAggregate at RowMatrix.scala:94) as input to shuffle 18 2023-04-22 21:12:11.655 DAGScheduler: INFO: Got job 20 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:11.655 DAGScheduler: INFO: Final stage: ResultStage 39 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:11.655 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 38) 2023-04-22 21:12:11.655 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 38) 2023-04-22 21:12:11.660 DAGScheduler: INFO: Submitting ShuffleMapStage 38 (MapPartitionsRDD[88] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:11.689 BlockManagerInfo: INFO: Removed broadcast_60_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.698 MemoryStore: INFO: Block broadcast_79 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:11.721 BlockManagerInfo: INFO: Removed broadcast_62_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.726 MemoryStore: INFO: Block broadcast_79_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:11.726 BlockManagerInfo: INFO: Added broadcast_79_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.726 SparkContext: INFO: Created broadcast 79 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:11.727 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 38 (MapPartitionsRDD[88] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:11.727 TaskSchedulerImpl: INFO: Adding task set 38.0 with 8 tasks resource 
profile 0 2023-04-22 21:12:11.728 TaskSetManager: INFO: Starting task 0.0 in stage 38.0 (TID 196) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.728 Executor: INFO: Running task 0.0 in stage 38.0 (TID 196) 2023-04-22 21:12:11.758 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:11.821 BlockManagerInfo: INFO: Removed broadcast_63_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.848 BlockManagerInfo: INFO: Removed broadcast_69_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.862 BlockManagerInfo: INFO: Removed broadcast_68_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.864 BlockManagerInfo: INFO: Removed broadcast_61_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.864 BlockManagerInfo: INFO: Removed broadcast_76_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.873 BlockManagerInfo: INFO: Removed broadcast_73_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.873 BlockManagerInfo: INFO: Removed broadcast_67_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.874 BlockManagerInfo: INFO: Removed broadcast_74_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.875 BlockManagerInfo: INFO: Removed broadcast_70_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.876 BlockManagerInfo: INFO: Removed broadcast_72_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.877 BlockManagerInfo: INFO: Removed broadcast_71_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:11.943 Executor: INFO: Finished task 0.0 in stage 38.0 (TID 196). 1197 bytes result sent to driver 2023-04-22 21:12:11.943 TaskSetManager: INFO: Starting task 1.0 in stage 38.0 (TID 197) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:11.943 TaskSetManager: INFO: Finished task 0.0 in stage 38.0 (TID 196) in 216 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:11.950 Executor: INFO: Running task 1.0 in stage 38.0 (TID 197) 2023-04-22 21:12:11.977 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:12.104 Executor: INFO: Finished task 1.0 in stage 38.0 (TID 197). 1197 bytes result sent to driver 2023-04-22 21:12:12.104 TaskSetManager: INFO: Starting task 2.0 in stage 38.0 (TID 198) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.105 TaskSetManager: INFO: Finished task 1.0 in stage 38.0 (TID 197) in 162 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:12.108 Executor: INFO: Running task 2.0 in stage 38.0 (TID 198) 2023-04-22 21:12:12.134 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:12.256 Executor: INFO: Finished task 2.0 in stage 38.0 (TID 198). 
1197 bytes result sent to driver 2023-04-22 21:12:12.256 TaskSetManager: INFO: Starting task 3.0 in stage 38.0 (TID 199) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.257 TaskSetManager: INFO: Finished task 2.0 in stage 38.0 (TID 198) in 153 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:12.257 Executor: INFO: Running task 3.0 in stage 38.0 (TID 199) 2023-04-22 21:12:12.283 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:12.408 Executor: INFO: Finished task 3.0 in stage 38.0 (TID 199). 1197 bytes result sent to driver 2023-04-22 21:12:12.408 TaskSetManager: INFO: Starting task 4.0 in stage 38.0 (TID 200) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.409 TaskSetManager: INFO: Finished task 3.0 in stage 38.0 (TID 199) in 153 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:12.417 Executor: INFO: Running task 4.0 in stage 38.0 (TID 200) 2023-04-22 21:12:12.442 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:12.571 Executor: INFO: Finished task 4.0 in stage 38.0 (TID 200). 1197 bytes result sent to driver 2023-04-22 21:12:12.572 TaskSetManager: INFO: Starting task 5.0 in stage 38.0 (TID 201) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.572 TaskSetManager: INFO: Finished task 4.0 in stage 38.0 (TID 200) in 164 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:12.572 Executor: INFO: Running task 5.0 in stage 38.0 (TID 201) 2023-04-22 21:12:12.598 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:12.723 Executor: INFO: Finished task 5.0 in stage 38.0 (TID 201). 1197 bytes result sent to driver 2023-04-22 21:12:12.723 TaskSetManager: INFO: Starting task 6.0 in stage 38.0 (TID 202) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.723 TaskSetManager: INFO: Finished task 5.0 in stage 38.0 (TID 201) in 151 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:12.724 Executor: INFO: Running task 6.0 in stage 38.0 (TID 202) 2023-04-22 21:12:12.751 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:12.892 Executor: INFO: Finished task 6.0 in stage 38.0 (TID 202). 1197 bytes result sent to driver 2023-04-22 21:12:12.893 TaskSetManager: INFO: Starting task 7.0 in stage 38.0 (TID 203) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:12.893 TaskSetManager: INFO: Finished task 6.0 in stage 38.0 (TID 202) in 170 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:12.894 Executor: INFO: Running task 7.0 in stage 38.0 (TID 203) 2023-04-22 21:12:12.919 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:13.047 Executor: INFO: Finished task 7.0 in stage 38.0 (TID 203). 
1197 bytes result sent to driver 2023-04-22 21:12:13.048 TaskSetManager: INFO: Finished task 7.0 in stage 38.0 (TID 203) in 155 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:13.048 TaskSchedulerImpl: INFO: Removed TaskSet 38.0, whose tasks have all completed, from pool 2023-04-22 21:12:13.048 DAGScheduler: INFO: ShuffleMapStage 38 (treeAggregate at RowMatrix.scala:94) finished in 1.388 s 2023-04-22 21:12:13.048 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:13.048 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:13.048 DAGScheduler: INFO: waiting: Set(ResultStage 39) 2023-04-22 21:12:13.048 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:13.048 DAGScheduler: INFO: Submitting ResultStage 39 (MapPartitionsRDD[90] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:13.102 MemoryStore: INFO: Block broadcast_80 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:13.108 MemoryStore: INFO: Block broadcast_80_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:13.112 BlockManagerInfo: INFO: Added broadcast_80_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:13.116 SparkContext: INFO: Created broadcast 80 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:13.118 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 39 (MapPartitionsRDD[90] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:13.118 TaskSchedulerImpl: INFO: Adding task set 39.0 with 2 tasks resource profile 0 2023-04-22 21:12:13.118 TaskSetManager: INFO: Starting task 0.0 in stage 39.0 (TID 204) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.119 Executor: INFO: Running task 0.0 in stage 39.0 (TID 204) 2023-04-22 21:12:13.151 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:13.151 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:13.156 Executor: INFO: Finished task 0.0 in stage 39.0 (TID 204). 34646 bytes result sent to driver 2023-04-22 21:12:13.159 TaskSetManager: INFO: Starting task 1.0 in stage 39.0 (TID 205) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.159 TaskSetManager: INFO: Finished task 0.0 in stage 39.0 (TID 204) in 41 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:13.160 Executor: INFO: Running task 1.0 in stage 39.0 (TID 205) 2023-04-22 21:12:13.185 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:13.185 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:13.190 Executor: INFO: Finished task 1.0 in stage 39.0 (TID 205). 
34646 bytes result sent to driver 2023-04-22 21:12:13.200 TaskSetManager: INFO: Finished task 1.0 in stage 39.0 (TID 205) in 41 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:13.200 TaskSchedulerImpl: INFO: Removed TaskSet 39.0, whose tasks have all completed, from pool 2023-04-22 21:12:13.200 DAGScheduler: INFO: ResultStage 39 (treeAggregate at RowMatrix.scala:94) finished in 0.151 s 2023-04-22 21:12:13.200 DAGScheduler: INFO: Job 20 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:13.200 TaskSchedulerImpl: INFO: Killing all running tasks in stage 39: Stage finished 2023-04-22 21:12:13.201 DAGScheduler: INFO: Job 20 finished: treeAggregate at RowMatrix.scala:94, took 1.546705 s 2023-04-22 21:12:13.203 MemoryStore: INFO: Block broadcast_81 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:13.205 MemoryStore: INFO: Block broadcast_81_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:13.205 BlockManagerInfo: INFO: Added broadcast_81_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:13.207 SparkContext: INFO: Created broadcast 81 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:13.285 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:13.286 DAGScheduler: INFO: Registering RDD 92 (treeAggregate at RowMatrix.scala:94) as input to shuffle 19 2023-04-22 21:12:13.286 DAGScheduler: INFO: Got job 21 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:13.286 DAGScheduler: INFO: Final stage: ResultStage 41 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:13.286 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 40) 2023-04-22 21:12:13.287 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 40) 2023-04-22 21:12:13.290 DAGScheduler: INFO: Submitting ShuffleMapStage 40 (MapPartitionsRDD[92] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:13.325 MemoryStore: INFO: Block broadcast_82 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:13.331 MemoryStore: INFO: Block broadcast_82_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:13.331 BlockManagerInfo: INFO: Added broadcast_82_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:13.332 SparkContext: INFO: Created broadcast 82 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:13.332 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 40 (MapPartitionsRDD[92] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:13.332 TaskSchedulerImpl: INFO: Adding task set 40.0 with 8 tasks resource profile 0 2023-04-22 21:12:13.333 TaskSetManager: INFO: Starting task 0.0 in stage 40.0 (TID 206) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.333 Executor: INFO: Running task 0.0 in stage 40.0 (TID 206) 2023-04-22 21:12:13.358 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:13.486 Executor: INFO: Finished task 0.0 in stage 40.0 (TID 206). 
1197 bytes result sent to driver 2023-04-22 21:12:13.486 TaskSetManager: INFO: Starting task 1.0 in stage 40.0 (TID 207) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.486 TaskSetManager: INFO: Finished task 0.0 in stage 40.0 (TID 206) in 154 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:13.489 Executor: INFO: Running task 1.0 in stage 40.0 (TID 207) 2023-04-22 21:12:13.515 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:13.643 Executor: INFO: Finished task 1.0 in stage 40.0 (TID 207). 1197 bytes result sent to driver 2023-04-22 21:12:13.644 TaskSetManager: INFO: Starting task 2.0 in stage 40.0 (TID 208) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.644 TaskSetManager: INFO: Finished task 1.0 in stage 40.0 (TID 207) in 158 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:13.644 Executor: INFO: Running task 2.0 in stage 40.0 (TID 208) 2023-04-22 21:12:13.670 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:13.798 Executor: INFO: Finished task 2.0 in stage 40.0 (TID 208). 1197 bytes result sent to driver 2023-04-22 21:12:13.801 TaskSetManager: INFO: Starting task 3.0 in stage 40.0 (TID 209) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.803 TaskSetManager: INFO: Finished task 2.0 in stage 40.0 (TID 208) in 160 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:13.804 Executor: INFO: Running task 3.0 in stage 40.0 (TID 209) 2023-04-22 21:12:13.829 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:13.957 Executor: INFO: Finished task 3.0 in stage 40.0 (TID 209). 1197 bytes result sent to driver 2023-04-22 21:12:13.957 TaskSetManager: INFO: Starting task 4.0 in stage 40.0 (TID 210) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:13.957 TaskSetManager: INFO: Finished task 3.0 in stage 40.0 (TID 209) in 157 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:13.958 Executor: INFO: Running task 4.0 in stage 40.0 (TID 210) 2023-04-22 21:12:13.983 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:14.135 Executor: INFO: Finished task 4.0 in stage 40.0 (TID 210). 1197 bytes result sent to driver 2023-04-22 21:12:14.136 TaskSetManager: INFO: Starting task 5.0 in stage 40.0 (TID 211) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.136 TaskSetManager: INFO: Finished task 4.0 in stage 40.0 (TID 210) in 179 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:14.136 Executor: INFO: Running task 5.0 in stage 40.0 (TID 211) 2023-04-22 21:12:14.162 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:14.296 Executor: INFO: Finished task 5.0 in stage 40.0 (TID 211). 
1197 bytes result sent to driver 2023-04-22 21:12:14.297 TaskSetManager: INFO: Starting task 6.0 in stage 40.0 (TID 212) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.297 TaskSetManager: INFO: Finished task 5.0 in stage 40.0 (TID 211) in 162 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:14.298 Executor: INFO: Running task 6.0 in stage 40.0 (TID 212) 2023-04-22 21:12:14.323 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:14.445 Executor: INFO: Finished task 6.0 in stage 40.0 (TID 212). 1197 bytes result sent to driver 2023-04-22 21:12:14.445 TaskSetManager: INFO: Starting task 7.0 in stage 40.0 (TID 213) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.446 TaskSetManager: INFO: Finished task 6.0 in stage 40.0 (TID 212) in 149 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:14.446 Executor: INFO: Running task 7.0 in stage 40.0 (TID 213) 2023-04-22 21:12:14.472 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:14.599 Executor: INFO: Finished task 7.0 in stage 40.0 (TID 213). 1197 bytes result sent to driver 2023-04-22 21:12:14.600 TaskSetManager: INFO: Finished task 7.0 in stage 40.0 (TID 213) in 155 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:14.600 TaskSchedulerImpl: INFO: Removed TaskSet 40.0, whose tasks have all completed, from pool 2023-04-22 21:12:14.601 DAGScheduler: INFO: ShuffleMapStage 40 (treeAggregate at RowMatrix.scala:94) finished in 1.310 s 2023-04-22 21:12:14.601 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:14.601 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:14.601 DAGScheduler: INFO: waiting: Set(ResultStage 41) 2023-04-22 21:12:14.601 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:14.601 DAGScheduler: INFO: Submitting ResultStage 41 (MapPartitionsRDD[94] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:14.641 MemoryStore: INFO: Block broadcast_83 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:14.651 MemoryStore: INFO: Block broadcast_83_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:14.651 BlockManagerInfo: INFO: Added broadcast_83_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:14.652 SparkContext: INFO: Created broadcast 83 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:14.652 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 41 (MapPartitionsRDD[94] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:14.652 TaskSchedulerImpl: INFO: Adding task set 41.0 with 2 tasks resource profile 0 2023-04-22 21:12:14.653 TaskSetManager: INFO: Starting task 0.0 in stage 41.0 (TID 214) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.653 Executor: INFO: Running task 0.0 in stage 41.0 (TID 214) 2023-04-22 21:12:14.679 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:14.679 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:14.683 Executor: INFO: Finished task 0.0 in stage 41.0 (TID 214). 34646 bytes result sent to driver 2023-04-22 21:12:14.690 TaskSetManager: INFO: Starting task 1.0 in stage 41.0 (TID 215) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.691 TaskSetManager: INFO: Finished task 0.0 in stage 41.0 (TID 214) in 38 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:14.693 Executor: INFO: Running task 1.0 in stage 41.0 (TID 215) 2023-04-22 21:12:14.718 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:14.718 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:14.722 Executor: INFO: Finished task 1.0 in stage 41.0 (TID 215). 34646 bytes result sent to driver 2023-04-22 21:12:14.730 TaskSetManager: INFO: Finished task 1.0 in stage 41.0 (TID 215) in 40 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:14.730 TaskSchedulerImpl: INFO: Removed TaskSet 41.0, whose tasks have all completed, from pool 2023-04-22 21:12:14.730 DAGScheduler: INFO: ResultStage 41 (treeAggregate at RowMatrix.scala:94) finished in 0.129 s 2023-04-22 21:12:14.730 DAGScheduler: INFO: Job 21 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:14.730 TaskSchedulerImpl: INFO: Killing all running tasks in stage 41: Stage finished 2023-04-22 21:12:14.731 DAGScheduler: INFO: Job 21 finished: treeAggregate at RowMatrix.scala:94, took 1.445264 s 2023-04-22 21:12:14.764 MemoryStore: INFO: Block broadcast_84 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:14.785 MemoryStore: INFO: Block broadcast_84_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:14.785 BlockManagerInfo: INFO: Added broadcast_84_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:14.787 SparkContext: INFO: Created broadcast 84 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:14.860 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:14.861 DAGScheduler: INFO: Registering RDD 96 (treeAggregate at RowMatrix.scala:94) as input to shuffle 20 2023-04-22 21:12:14.862 DAGScheduler: INFO: Got job 22 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:14.862 DAGScheduler: INFO: Final stage: ResultStage 43 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:14.862 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 42) 2023-04-22 21:12:14.862 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 42) 2023-04-22 21:12:14.863 DAGScheduler: INFO: Submitting ShuffleMapStage 42 (MapPartitionsRDD[96] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:14.899 MemoryStore: INFO: Block broadcast_85 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:14.905 MemoryStore: INFO: Block broadcast_85_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:14.906 BlockManagerInfo: INFO: Added broadcast_85_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:14.907 SparkContext: INFO: Created broadcast 85 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:12:14.907 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 42 (MapPartitionsRDD[96] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:14.907 TaskSchedulerImpl: INFO: Adding task set 42.0 with 8 tasks resource profile 0 2023-04-22 21:12:14.908 TaskSetManager: INFO: Starting task 0.0 in stage 42.0 (TID 216) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:14.915 Executor: INFO: Running task 0.0 in stage 42.0 (TID 216) 2023-04-22 21:12:14.941 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:15.069 Executor: INFO: Finished task 0.0 in stage 42.0 (TID 216). 1197 bytes result sent to driver 2023-04-22 21:12:15.080 TaskSetManager: INFO: Starting task 1.0 in stage 42.0 (TID 217) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.081 TaskSetManager: INFO: Finished task 0.0 in stage 42.0 (TID 216) in 172 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:15.087 Executor: INFO: Running task 1.0 in stage 42.0 (TID 217) 2023-04-22 21:12:15.113 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:15.251 Executor: INFO: Finished task 1.0 in stage 42.0 (TID 217). 1197 bytes result sent to driver 2023-04-22 21:12:15.252 TaskSetManager: INFO: Starting task 2.0 in stage 42.0 (TID 218) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.252 TaskSetManager: INFO: Finished task 1.0 in stage 42.0 (TID 217) in 172 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:15.256 Executor: INFO: Running task 2.0 in stage 42.0 (TID 218) 2023-04-22 21:12:15.282 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:15.411 Executor: INFO: Finished task 2.0 in stage 42.0 (TID 218). 1197 bytes result sent to driver 2023-04-22 21:12:15.411 TaskSetManager: INFO: Starting task 3.0 in stage 42.0 (TID 219) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.412 TaskSetManager: INFO: Finished task 2.0 in stage 42.0 (TID 218) in 160 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:15.433 Executor: INFO: Running task 3.0 in stage 42.0 (TID 219) 2023-04-22 21:12:15.464 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:15.590 Executor: INFO: Finished task 3.0 in stage 42.0 (TID 219). 1197 bytes result sent to driver 2023-04-22 21:12:15.592 TaskSetManager: INFO: Starting task 4.0 in stage 42.0 (TID 220) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.592 TaskSetManager: INFO: Finished task 3.0 in stage 42.0 (TID 219) in 181 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:15.600 Executor: INFO: Running task 4.0 in stage 42.0 (TID 220) 2023-04-22 21:12:15.625 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:15.754 Executor: INFO: Finished task 4.0 in stage 42.0 (TID 220). 
1197 bytes result sent to driver 2023-04-22 21:12:15.754 TaskSetManager: INFO: Starting task 5.0 in stage 42.0 (TID 221) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.757 TaskSetManager: INFO: Finished task 4.0 in stage 42.0 (TID 220) in 165 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:15.758 Executor: INFO: Running task 5.0 in stage 42.0 (TID 221) 2023-04-22 21:12:15.784 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:15.908 Executor: INFO: Finished task 5.0 in stage 42.0 (TID 221). 1197 bytes result sent to driver 2023-04-22 21:12:15.908 TaskSetManager: INFO: Starting task 6.0 in stage 42.0 (TID 222) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:15.909 TaskSetManager: INFO: Finished task 5.0 in stage 42.0 (TID 221) in 155 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:15.909 Executor: INFO: Running task 6.0 in stage 42.0 (TID 222) 2023-04-22 21:12:15.934 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:16.057 Executor: INFO: Finished task 6.0 in stage 42.0 (TID 222). 1197 bytes result sent to driver 2023-04-22 21:12:16.058 TaskSetManager: INFO: Starting task 7.0 in stage 42.0 (TID 223) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.061 TaskSetManager: INFO: Finished task 6.0 in stage 42.0 (TID 222) in 153 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:16.061 Executor: INFO: Running task 7.0 in stage 42.0 (TID 223) 2023-04-22 21:12:16.087 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:16.228 Executor: INFO: Finished task 7.0 in stage 42.0 (TID 223). 
1197 bytes result sent to driver 2023-04-22 21:12:16.228 TaskSetManager: INFO: Finished task 7.0 in stage 42.0 (TID 223) in 170 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:16.228 TaskSchedulerImpl: INFO: Removed TaskSet 42.0, whose tasks have all completed, from pool 2023-04-22 21:12:16.229 DAGScheduler: INFO: ShuffleMapStage 42 (treeAggregate at RowMatrix.scala:94) finished in 1.365 s 2023-04-22 21:12:16.229 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:16.229 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:16.229 DAGScheduler: INFO: waiting: Set(ResultStage 43) 2023-04-22 21:12:16.229 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:16.229 DAGScheduler: INFO: Submitting ResultStage 43 (MapPartitionsRDD[98] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:16.269 MemoryStore: INFO: Block broadcast_86 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:16.275 MemoryStore: INFO: Block broadcast_86_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:16.276 BlockManagerInfo: INFO: Added broadcast_86_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:16.276 SparkContext: INFO: Created broadcast 86 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:16.276 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 43 (MapPartitionsRDD[98] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:16.276 TaskSchedulerImpl: INFO: Adding task set 43.0 with 2 tasks resource profile 0 2023-04-22 21:12:16.277 TaskSetManager: INFO: Starting task 0.0 in stage 43.0 (TID 224) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.277 Executor: INFO: Running task 0.0 in stage 43.0 (TID 224) 2023-04-22 21:12:16.303 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:16.303 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:16.307 Executor: INFO: Finished task 0.0 in stage 43.0 (TID 224). 34646 bytes result sent to driver 2023-04-22 21:12:16.308 TaskSetManager: INFO: Starting task 1.0 in stage 43.0 (TID 225) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.308 TaskSetManager: INFO: Finished task 0.0 in stage 43.0 (TID 224) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:16.309 Executor: INFO: Running task 1.0 in stage 43.0 (TID 225) 2023-04-22 21:12:16.334 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:16.334 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:16.338 Executor: INFO: Finished task 1.0 in stage 43.0 (TID 225). 
34646 bytes result sent to driver 2023-04-22 21:12:16.343 TaskSetManager: INFO: Finished task 1.0 in stage 43.0 (TID 225) in 35 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:16.343 TaskSchedulerImpl: INFO: Removed TaskSet 43.0, whose tasks have all completed, from pool 2023-04-22 21:12:16.343 DAGScheduler: INFO: ResultStage 43 (treeAggregate at RowMatrix.scala:94) finished in 0.114 s 2023-04-22 21:12:16.344 DAGScheduler: INFO: Job 22 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:16.344 TaskSchedulerImpl: INFO: Killing all running tasks in stage 43: Stage finished 2023-04-22 21:12:16.344 DAGScheduler: INFO: Job 22 finished: treeAggregate at RowMatrix.scala:94, took 1.484390 s 2023-04-22 21:12:16.346 MemoryStore: INFO: Block broadcast_87 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:16.347 MemoryStore: INFO: Block broadcast_87_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:16.348 BlockManagerInfo: INFO: Added broadcast_87_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:16.349 SparkContext: INFO: Created broadcast 87 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:16.412 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:16.414 DAGScheduler: INFO: Registering RDD 100 (treeAggregate at RowMatrix.scala:94) as input to shuffle 21 2023-04-22 21:12:16.414 DAGScheduler: INFO: Got job 23 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:16.414 DAGScheduler: INFO: Final stage: ResultStage 45 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:16.414 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 44) 2023-04-22 21:12:16.414 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 44) 2023-04-22 21:12:16.416 DAGScheduler: INFO: Submitting ShuffleMapStage 44 (MapPartitionsRDD[100] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:16.451 MemoryStore: INFO: Block broadcast_88 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:16.457 MemoryStore: INFO: Block broadcast_88_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:16.457 BlockManagerInfo: INFO: Added broadcast_88_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:16.458 SparkContext: INFO: Created broadcast 88 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:16.459 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 44 (MapPartitionsRDD[100] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:16.459 TaskSchedulerImpl: INFO: Adding task set 44.0 with 8 tasks resource profile 0 2023-04-22 21:12:16.459 TaskSetManager: INFO: Starting task 0.0 in stage 44.0 (TID 226) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.460 Executor: INFO: Running task 0.0 in stage 44.0 (TID 226) 2023-04-22 21:12:16.485 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:16.615 Executor: INFO: Finished task 0.0 in stage 44.0 (TID 226). 
1197 bytes result sent to driver 2023-04-22 21:12:16.615 TaskSetManager: INFO: Starting task 1.0 in stage 44.0 (TID 227) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.615 TaskSetManager: INFO: Finished task 0.0 in stage 44.0 (TID 226) in 156 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:16.616 Executor: INFO: Running task 1.0 in stage 44.0 (TID 227) 2023-04-22 21:12:16.642 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:16.771 Executor: INFO: Finished task 1.0 in stage 44.0 (TID 227). 1197 bytes result sent to driver 2023-04-22 21:12:16.771 TaskSetManager: INFO: Starting task 2.0 in stage 44.0 (TID 228) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.771 TaskSetManager: INFO: Finished task 1.0 in stage 44.0 (TID 227) in 156 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:16.772 Executor: INFO: Running task 2.0 in stage 44.0 (TID 228) 2023-04-22 21:12:16.797 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:16.920 Executor: INFO: Finished task 2.0 in stage 44.0 (TID 228). 1197 bytes result sent to driver 2023-04-22 21:12:16.921 TaskSetManager: INFO: Starting task 3.0 in stage 44.0 (TID 229) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:16.924 TaskSetManager: INFO: Finished task 2.0 in stage 44.0 (TID 228) in 153 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:16.924 Executor: INFO: Running task 3.0 in stage 44.0 (TID 229) 2023-04-22 21:12:16.950 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:17.072 Executor: INFO: Finished task 3.0 in stage 44.0 (TID 229). 1197 bytes result sent to driver 2023-04-22 21:12:17.072 TaskSetManager: INFO: Starting task 4.0 in stage 44.0 (TID 230) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.073 TaskSetManager: INFO: Finished task 3.0 in stage 44.0 (TID 229) in 152 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:17.074 Executor: INFO: Running task 4.0 in stage 44.0 (TID 230) 2023-04-22 21:12:17.100 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:17.237 Executor: INFO: Finished task 4.0 in stage 44.0 (TID 230). 1197 bytes result sent to driver 2023-04-22 21:12:17.238 TaskSetManager: INFO: Starting task 5.0 in stage 44.0 (TID 231) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.238 TaskSetManager: INFO: Finished task 4.0 in stage 44.0 (TID 230) in 166 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:17.239 Executor: INFO: Running task 5.0 in stage 44.0 (TID 231) 2023-04-22 21:12:17.264 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:17.390 Executor: INFO: Finished task 5.0 in stage 44.0 (TID 231). 
1197 bytes result sent to driver 2023-04-22 21:12:17.392 TaskSetManager: INFO: Starting task 6.0 in stage 44.0 (TID 232) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.394 TaskSetManager: INFO: Finished task 5.0 in stage 44.0 (TID 231) in 157 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:17.396 Executor: INFO: Running task 6.0 in stage 44.0 (TID 232) 2023-04-22 21:12:17.421 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:17.542 Executor: INFO: Finished task 6.0 in stage 44.0 (TID 232). 1197 bytes result sent to driver 2023-04-22 21:12:17.542 TaskSetManager: INFO: Starting task 7.0 in stage 44.0 (TID 233) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.542 TaskSetManager: INFO: Finished task 6.0 in stage 44.0 (TID 232) in 150 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:17.543 Executor: INFO: Running task 7.0 in stage 44.0 (TID 233) 2023-04-22 21:12:17.568 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:17.695 Executor: INFO: Finished task 7.0 in stage 44.0 (TID 233). 1197 bytes result sent to driver 2023-04-22 21:12:17.697 TaskSetManager: INFO: Finished task 7.0 in stage 44.0 (TID 233) in 155 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:17.697 TaskSchedulerImpl: INFO: Removed TaskSet 44.0, whose tasks have all completed, from pool 2023-04-22 21:12:17.697 DAGScheduler: INFO: ShuffleMapStage 44 (treeAggregate at RowMatrix.scala:94) finished in 1.281 s 2023-04-22 21:12:17.697 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:17.697 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:17.697 DAGScheduler: INFO: waiting: Set(ResultStage 45) 2023-04-22 21:12:17.697 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:17.697 DAGScheduler: INFO: Submitting ResultStage 45 (MapPartitionsRDD[102] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:17.738 MemoryStore: INFO: Block broadcast_89 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:17.744 MemoryStore: INFO: Block broadcast_89_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:17.745 BlockManagerInfo: INFO: Added broadcast_89_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:17.746 SparkContext: INFO: Created broadcast 89 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:17.746 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 45 (MapPartitionsRDD[102] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:17.746 TaskSchedulerImpl: INFO: Adding task set 45.0 with 2 tasks resource profile 0 2023-04-22 21:12:17.746 TaskSetManager: INFO: Starting task 0.0 in stage 45.0 (TID 234) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.747 Executor: INFO: Running task 0.0 in stage 45.0 (TID 234) 2023-04-22 21:12:17.775 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:17.775 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:17.790 Executor: INFO: Finished task 0.0 in stage 45.0 (TID 234). 34646 bytes result sent to driver 2023-04-22 21:12:17.791 TaskSetManager: INFO: Starting task 1.0 in stage 45.0 (TID 235) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.792 TaskSetManager: INFO: Finished task 0.0 in stage 45.0 (TID 234) in 46 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:17.793 Executor: INFO: Running task 1.0 in stage 45.0 (TID 235) 2023-04-22 21:12:17.818 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:17.818 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:17.822 Executor: INFO: Finished task 1.0 in stage 45.0 (TID 235). 34646 bytes result sent to driver 2023-04-22 21:12:17.823 TaskSetManager: INFO: Finished task 1.0 in stage 45.0 (TID 235) in 32 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:17.823 TaskSchedulerImpl: INFO: Removed TaskSet 45.0, whose tasks have all completed, from pool 2023-04-22 21:12:17.823 DAGScheduler: INFO: ResultStage 45 (treeAggregate at RowMatrix.scala:94) finished in 0.125 s 2023-04-22 21:12:17.823 DAGScheduler: INFO: Job 23 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:17.824 TaskSchedulerImpl: INFO: Killing all running tasks in stage 45: Stage finished 2023-04-22 21:12:17.824 DAGScheduler: INFO: Job 23 finished: treeAggregate at RowMatrix.scala:94, took 1.411658 s 2023-04-22 21:12:17.826 MemoryStore: INFO: Block broadcast_90 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:17.827 MemoryStore: INFO: Block broadcast_90_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:17.828 BlockManagerInfo: INFO: Added broadcast_90_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:17.841 SparkContext: INFO: Created broadcast 90 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:17.912 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:17.913 DAGScheduler: INFO: Registering RDD 104 (treeAggregate at RowMatrix.scala:94) as input to shuffle 22 2023-04-22 21:12:17.913 DAGScheduler: INFO: Got job 24 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:17.913 DAGScheduler: INFO: Final stage: ResultStage 47 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:17.913 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 46) 2023-04-22 21:12:17.913 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 46) 2023-04-22 21:12:17.914 DAGScheduler: INFO: Submitting ShuffleMapStage 46 (MapPartitionsRDD[104] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:17.952 MemoryStore: INFO: Block broadcast_91 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:17.958 MemoryStore: INFO: Block broadcast_91_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:17.958 BlockManagerInfo: INFO: Added broadcast_91_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:17.959 SparkContext: INFO: Created broadcast 91 from broadcast at DAGScheduler.scala:1513 2023-04-22 
21:12:17.959 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 46 (MapPartitionsRDD[104] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:17.959 TaskSchedulerImpl: INFO: Adding task set 46.0 with 8 tasks resource profile 0 2023-04-22 21:12:17.960 TaskSetManager: INFO: Starting task 0.0 in stage 46.0 (TID 236) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:17.960 Executor: INFO: Running task 0.0 in stage 46.0 (TID 236) 2023-04-22 21:12:17.985 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:18.112 Executor: INFO: Finished task 0.0 in stage 46.0 (TID 236). 1197 bytes result sent to driver 2023-04-22 21:12:18.113 TaskSetManager: INFO: Starting task 1.0 in stage 46.0 (TID 237) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.114 TaskSetManager: INFO: Finished task 0.0 in stage 46.0 (TID 236) in 154 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:18.114 Executor: INFO: Running task 1.0 in stage 46.0 (TID 237) 2023-04-22 21:12:18.140 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:18.282 Executor: INFO: Finished task 1.0 in stage 46.0 (TID 237). 1197 bytes result sent to driver 2023-04-22 21:12:18.282 TaskSetManager: INFO: Starting task 2.0 in stage 46.0 (TID 238) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.282 TaskSetManager: INFO: Finished task 1.0 in stage 46.0 (TID 237) in 169 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:18.284 Executor: INFO: Running task 2.0 in stage 46.0 (TID 238) 2023-04-22 21:12:18.310 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:18.438 Executor: INFO: Finished task 2.0 in stage 46.0 (TID 238). 1197 bytes result sent to driver 2023-04-22 21:12:18.439 TaskSetManager: INFO: Starting task 3.0 in stage 46.0 (TID 239) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.439 TaskSetManager: INFO: Finished task 2.0 in stage 46.0 (TID 238) in 157 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:18.440 Executor: INFO: Running task 3.0 in stage 46.0 (TID 239) 2023-04-22 21:12:18.465 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:18.588 Executor: INFO: Finished task 3.0 in stage 46.0 (TID 239). 1197 bytes result sent to driver 2023-04-22 21:12:18.588 TaskSetManager: INFO: Starting task 4.0 in stage 46.0 (TID 240) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.592 TaskSetManager: INFO: Finished task 3.0 in stage 46.0 (TID 239) in 153 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:18.592 Executor: INFO: Running task 4.0 in stage 46.0 (TID 240) 2023-04-22 21:12:18.617 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:18.746 Executor: INFO: Finished task 4.0 in stage 46.0 (TID 240). 
1197 bytes result sent to driver 2023-04-22 21:12:18.748 TaskSetManager: INFO: Starting task 5.0 in stage 46.0 (TID 241) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.751 TaskSetManager: INFO: Finished task 4.0 in stage 46.0 (TID 240) in 163 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:18.767 Executor: INFO: Running task 5.0 in stage 46.0 (TID 241) 2023-04-22 21:12:18.794 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:18.917 Executor: INFO: Finished task 5.0 in stage 46.0 (TID 241). 1197 bytes result sent to driver 2023-04-22 21:12:18.918 TaskSetManager: INFO: Starting task 6.0 in stage 46.0 (TID 242) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:18.918 TaskSetManager: INFO: Finished task 5.0 in stage 46.0 (TID 241) in 171 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:18.919 Executor: INFO: Running task 6.0 in stage 46.0 (TID 242) 2023-04-22 21:12:18.944 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:19.071 Executor: INFO: Finished task 6.0 in stage 46.0 (TID 242). 1197 bytes result sent to driver 2023-04-22 21:12:19.071 TaskSetManager: INFO: Starting task 7.0 in stage 46.0 (TID 243) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.072 TaskSetManager: INFO: Finished task 6.0 in stage 46.0 (TID 242) in 154 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:19.073 Executor: INFO: Running task 7.0 in stage 46.0 (TID 243) 2023-04-22 21:12:19.098 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:19.239 Executor: INFO: Finished task 7.0 in stage 46.0 (TID 243). 
1197 bytes result sent to driver 2023-04-22 21:12:19.241 TaskSetManager: INFO: Finished task 7.0 in stage 46.0 (TID 243) in 170 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:19.241 TaskSchedulerImpl: INFO: Removed TaskSet 46.0, whose tasks have all completed, from pool 2023-04-22 21:12:19.241 DAGScheduler: INFO: ShuffleMapStage 46 (treeAggregate at RowMatrix.scala:94) finished in 1.326 s 2023-04-22 21:12:19.241 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:19.241 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:19.241 DAGScheduler: INFO: waiting: Set(ResultStage 47) 2023-04-22 21:12:19.241 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:19.241 DAGScheduler: INFO: Submitting ResultStage 47 (MapPartitionsRDD[106] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:19.280 MemoryStore: INFO: Block broadcast_92 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:19.288 MemoryStore: INFO: Block broadcast_92_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:19.288 BlockManagerInfo: INFO: Added broadcast_92_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:19.288 SparkContext: INFO: Created broadcast 92 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:19.289 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 47 (MapPartitionsRDD[106] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:19.289 TaskSchedulerImpl: INFO: Adding task set 47.0 with 2 tasks resource profile 0 2023-04-22 21:12:19.289 TaskSetManager: INFO: Starting task 0.0 in stage 47.0 (TID 244) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.290 Executor: INFO: Running task 0.0 in stage 47.0 (TID 244) 2023-04-22 21:12:19.317 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:19.317 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:19.330 Executor: INFO: Finished task 0.0 in stage 47.0 (TID 244). 34646 bytes result sent to driver 2023-04-22 21:12:19.331 TaskSetManager: INFO: Starting task 1.0 in stage 47.0 (TID 245) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.331 TaskSetManager: INFO: Finished task 0.0 in stage 47.0 (TID 244) in 42 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:19.331 Executor: INFO: Running task 1.0 in stage 47.0 (TID 245) 2023-04-22 21:12:19.357 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:19.357 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:19.379 Executor: INFO: Finished task 1.0 in stage 47.0 (TID 245). 
34646 bytes result sent to driver 2023-04-22 21:12:19.379 TaskSetManager: INFO: Finished task 1.0 in stage 47.0 (TID 245) in 49 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:19.379 TaskSchedulerImpl: INFO: Removed TaskSet 47.0, whose tasks have all completed, from pool 2023-04-22 21:12:19.380 DAGScheduler: INFO: ResultStage 47 (treeAggregate at RowMatrix.scala:94) finished in 0.138 s 2023-04-22 21:12:19.380 DAGScheduler: INFO: Job 24 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:19.380 TaskSchedulerImpl: INFO: Killing all running tasks in stage 47: Stage finished 2023-04-22 21:12:19.380 DAGScheduler: INFO: Job 24 finished: treeAggregate at RowMatrix.scala:94, took 1.468271 s 2023-04-22 21:12:19.382 MemoryStore: INFO: Block broadcast_93 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:19.384 MemoryStore: INFO: Block broadcast_93_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:19.386 BlockManagerInfo: INFO: Added broadcast_93_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:19.386 SparkContext: INFO: Created broadcast 93 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:19.499 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:19.500 DAGScheduler: INFO: Registering RDD 108 (treeAggregate at RowMatrix.scala:94) as input to shuffle 23 2023-04-22 21:12:19.500 DAGScheduler: INFO: Got job 25 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:19.500 DAGScheduler: INFO: Final stage: ResultStage 49 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:19.500 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 48) 2023-04-22 21:12:19.500 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 48) 2023-04-22 21:12:19.505 DAGScheduler: INFO: Submitting ShuffleMapStage 48 (MapPartitionsRDD[108] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:19.546 MemoryStore: INFO: Block broadcast_94 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:19.552 MemoryStore: INFO: Block broadcast_94_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:19.552 BlockManagerInfo: INFO: Added broadcast_94_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:19.556 SparkContext: INFO: Created broadcast 94 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:19.557 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 48 (MapPartitionsRDD[108] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:19.557 TaskSchedulerImpl: INFO: Adding task set 48.0 with 8 tasks resource profile 0 2023-04-22 21:12:19.558 TaskSetManager: INFO: Starting task 0.0 in stage 48.0 (TID 246) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.558 Executor: INFO: Running task 0.0 in stage 48.0 (TID 246) 2023-04-22 21:12:19.583 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:19.706 Executor: INFO: Finished task 0.0 in stage 48.0 (TID 246). 
1197 bytes result sent to driver 2023-04-22 21:12:19.707 TaskSetManager: INFO: Starting task 1.0 in stage 48.0 (TID 247) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.707 TaskSetManager: INFO: Finished task 0.0 in stage 48.0 (TID 246) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:19.708 Executor: INFO: Running task 1.0 in stage 48.0 (TID 247) 2023-04-22 21:12:19.733 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:19.858 Executor: INFO: Finished task 1.0 in stage 48.0 (TID 247). 1197 bytes result sent to driver 2023-04-22 21:12:19.858 TaskSetManager: INFO: Starting task 2.0 in stage 48.0 (TID 248) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:19.859 TaskSetManager: INFO: Finished task 1.0 in stage 48.0 (TID 247) in 152 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:19.859 Executor: INFO: Running task 2.0 in stage 48.0 (TID 248) 2023-04-22 21:12:19.884 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:20.007 Executor: INFO: Finished task 2.0 in stage 48.0 (TID 248). 1197 bytes result sent to driver 2023-04-22 21:12:20.008 TaskSetManager: INFO: Starting task 3.0 in stage 48.0 (TID 249) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.008 TaskSetManager: INFO: Finished task 2.0 in stage 48.0 (TID 248) in 150 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:20.008 Executor: INFO: Running task 3.0 in stage 48.0 (TID 249) 2023-04-22 21:12:20.033 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:20.161 Executor: INFO: Finished task 3.0 in stage 48.0 (TID 249). 1197 bytes result sent to driver 2023-04-22 21:12:20.162 TaskSetManager: INFO: Starting task 4.0 in stage 48.0 (TID 250) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.173 TaskSetManager: INFO: Finished task 3.0 in stage 48.0 (TID 249) in 165 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:20.177 Executor: INFO: Running task 4.0 in stage 48.0 (TID 250) 2023-04-22 21:12:20.205 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:20.350 Executor: INFO: Finished task 4.0 in stage 48.0 (TID 250). 1197 bytes result sent to driver 2023-04-22 21:12:20.351 TaskSetManager: INFO: Starting task 5.0 in stage 48.0 (TID 251) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.351 TaskSetManager: INFO: Finished task 4.0 in stage 48.0 (TID 250) in 189 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:20.352 Executor: INFO: Running task 5.0 in stage 48.0 (TID 251) 2023-04-22 21:12:20.381 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:20.505 Executor: INFO: Finished task 5.0 in stage 48.0 (TID 251). 
1197 bytes result sent to driver 2023-04-22 21:12:20.506 TaskSetManager: INFO: Starting task 6.0 in stage 48.0 (TID 252) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.506 TaskSetManager: INFO: Finished task 5.0 in stage 48.0 (TID 251) in 155 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:20.507 Executor: INFO: Running task 6.0 in stage 48.0 (TID 252) 2023-04-22 21:12:20.591 BlockManagerInfo: INFO: Removed broadcast_81_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.593 BlockManagerInfo: INFO: Removed broadcast_78_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.606 BlockManagerInfo: INFO: Removed broadcast_82_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.607 BlockManagerInfo: INFO: Removed broadcast_84_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.608 BlockManagerInfo: INFO: Removed broadcast_86_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.609 BlockManagerInfo: INFO: Removed broadcast_91_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.613 BlockManagerInfo: INFO: Removed broadcast_83_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.614 BlockManagerInfo: INFO: Removed broadcast_85_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.615 BlockManagerInfo: INFO: Removed broadcast_80_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.617 BlockManagerInfo: INFO: Removed broadcast_90_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.617 BlockManagerInfo: INFO: Removed broadcast_79_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.618 BlockManagerInfo: INFO: Removed broadcast_87_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.619 BlockManagerInfo: INFO: Removed broadcast_88_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.620 BlockManagerInfo: INFO: Removed broadcast_77_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.624 BlockManagerInfo: INFO: Removed broadcast_75_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.627 BlockManagerInfo: INFO: Removed broadcast_89_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.628 BlockManagerInfo: INFO: Removed broadcast_92_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.640 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:20.768 Executor: INFO: Finished task 6.0 in stage 48.0 (TID 252). 
1240 bytes result sent to driver 2023-04-22 21:12:20.769 TaskSetManager: INFO: Starting task 7.0 in stage 48.0 (TID 253) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.769 TaskSetManager: INFO: Finished task 6.0 in stage 48.0 (TID 252) in 264 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:20.770 Executor: INFO: Running task 7.0 in stage 48.0 (TID 253) 2023-04-22 21:12:20.797 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:20.918 Executor: INFO: Finished task 7.0 in stage 48.0 (TID 253). 1197 bytes result sent to driver 2023-04-22 21:12:20.920 TaskSetManager: INFO: Finished task 7.0 in stage 48.0 (TID 253) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:20.920 TaskSchedulerImpl: INFO: Removed TaskSet 48.0, whose tasks have all completed, from pool 2023-04-22 21:12:20.921 DAGScheduler: INFO: ShuffleMapStage 48 (treeAggregate at RowMatrix.scala:94) finished in 1.415 s 2023-04-22 21:12:20.921 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:20.921 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:20.921 DAGScheduler: INFO: waiting: Set(ResultStage 49) 2023-04-22 21:12:20.921 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:20.921 DAGScheduler: INFO: Submitting ResultStage 49 (MapPartitionsRDD[110] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:20.962 MemoryStore: INFO: Block broadcast_95 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:20.968 MemoryStore: INFO: Block broadcast_95_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:20.968 BlockManagerInfo: INFO: Added broadcast_95_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:20.969 SparkContext: INFO: Created broadcast 95 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:20.969 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 49 (MapPartitionsRDD[110] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:20.969 TaskSchedulerImpl: INFO: Adding task set 49.0 with 2 tasks resource profile 0 2023-04-22 21:12:20.971 TaskSetManager: INFO: Starting task 0.0 in stage 49.0 (TID 254) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:20.971 Executor: INFO: Running task 0.0 in stage 49.0 (TID 254) 2023-04-22 21:12:21.003 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:21.003 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 1 ms 2023-04-22 21:12:21.011 Executor: INFO: Finished task 0.0 in stage 49.0 (TID 254). 
34646 bytes result sent to driver 2023-04-22 21:12:21.012 TaskSetManager: INFO: Starting task 1.0 in stage 49.0 (TID 255) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.012 TaskSetManager: INFO: Finished task 0.0 in stage 49.0 (TID 254) in 41 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:21.016 Executor: INFO: Running task 1.0 in stage 49.0 (TID 255) 2023-04-22 21:12:21.074 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:21.074 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 5 ms 2023-04-22 21:12:21.079 Executor: INFO: Finished task 1.0 in stage 49.0 (TID 255). 34646 bytes result sent to driver 2023-04-22 21:12:21.079 TaskSetManager: INFO: Finished task 1.0 in stage 49.0 (TID 255) in 67 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:21.079 TaskSchedulerImpl: INFO: Removed TaskSet 49.0, whose tasks have all completed, from pool 2023-04-22 21:12:21.080 DAGScheduler: INFO: ResultStage 49 (treeAggregate at RowMatrix.scala:94) finished in 0.158 s 2023-04-22 21:12:21.080 DAGScheduler: INFO: Job 25 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:21.080 TaskSchedulerImpl: INFO: Killing all running tasks in stage 49: Stage finished 2023-04-22 21:12:21.080 DAGScheduler: INFO: Job 25 finished: treeAggregate at RowMatrix.scala:94, took 1.580658 s 2023-04-22 21:12:21.082 MemoryStore: INFO: Block broadcast_96 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:21.084 MemoryStore: INFO: Block broadcast_96_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:21.084 BlockManagerInfo: INFO: Added broadcast_96_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:21.084 SparkContext: INFO: Created broadcast 96 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:21.201 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:21.202 DAGScheduler: INFO: Registering RDD 112 (treeAggregate at RowMatrix.scala:94) as input to shuffle 24 2023-04-22 21:12:21.202 DAGScheduler: INFO: Got job 26 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:21.202 DAGScheduler: INFO: Final stage: ResultStage 51 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:21.202 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 50) 2023-04-22 21:12:21.202 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 50) 2023-04-22 21:12:21.206 DAGScheduler: INFO: Submitting ShuffleMapStage 50 (MapPartitionsRDD[112] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:21.264 MemoryStore: INFO: Block broadcast_97 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:21.270 MemoryStore: INFO: Block broadcast_97_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:21.271 BlockManagerInfo: INFO: Added broadcast_97_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:21.287 SparkContext: INFO: Created broadcast 97 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:21.287 DAGScheduler: INFO: Submitting 8 missing tasks from 
ShuffleMapStage 50 (MapPartitionsRDD[112] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:21.287 TaskSchedulerImpl: INFO: Adding task set 50.0 with 8 tasks resource profile 0 2023-04-22 21:12:21.288 TaskSetManager: INFO: Starting task 0.0 in stage 50.0 (TID 256) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.288 Executor: INFO: Running task 0.0 in stage 50.0 (TID 256) 2023-04-22 21:12:21.315 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:21.471 Executor: INFO: Finished task 0.0 in stage 50.0 (TID 256). 1197 bytes result sent to driver 2023-04-22 21:12:21.472 TaskSetManager: INFO: Starting task 1.0 in stage 50.0 (TID 257) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.473 TaskSetManager: INFO: Finished task 0.0 in stage 50.0 (TID 256) in 185 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:21.485 Executor: INFO: Running task 1.0 in stage 50.0 (TID 257) 2023-04-22 21:12:21.510 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:21.649 Executor: INFO: Finished task 1.0 in stage 50.0 (TID 257). 1197 bytes result sent to driver 2023-04-22 21:12:21.649 TaskSetManager: INFO: Starting task 2.0 in stage 50.0 (TID 258) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.649 TaskSetManager: INFO: Finished task 1.0 in stage 50.0 (TID 257) in 177 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:21.650 Executor: INFO: Running task 2.0 in stage 50.0 (TID 258) 2023-04-22 21:12:21.675 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:21.803 Executor: INFO: Finished task 2.0 in stage 50.0 (TID 258). 1197 bytes result sent to driver 2023-04-22 21:12:21.803 TaskSetManager: INFO: Starting task 3.0 in stage 50.0 (TID 259) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.804 TaskSetManager: INFO: Finished task 2.0 in stage 50.0 (TID 258) in 155 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:21.805 Executor: INFO: Running task 3.0 in stage 50.0 (TID 259) 2023-04-22 21:12:21.830 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:21.952 Executor: INFO: Finished task 3.0 in stage 50.0 (TID 259). 1197 bytes result sent to driver 2023-04-22 21:12:21.954 TaskSetManager: INFO: Starting task 4.0 in stage 50.0 (TID 260) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:21.954 TaskSetManager: INFO: Finished task 3.0 in stage 50.0 (TID 259) in 151 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:21.963 Executor: INFO: Running task 4.0 in stage 50.0 (TID 260) 2023-04-22 21:12:21.994 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:22.119 Executor: INFO: Finished task 4.0 in stage 50.0 (TID 260). 
1197 bytes result sent to driver 2023-04-22 21:12:22.119 TaskSetManager: INFO: Starting task 5.0 in stage 50.0 (TID 261) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.121 TaskSetManager: INFO: Finished task 4.0 in stage 50.0 (TID 260) in 168 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:22.121 Executor: INFO: Running task 5.0 in stage 50.0 (TID 261) 2023-04-22 21:12:22.147 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:22.271 Executor: INFO: Finished task 5.0 in stage 50.0 (TID 261). 1197 bytes result sent to driver 2023-04-22 21:12:22.277 TaskSetManager: INFO: Starting task 6.0 in stage 50.0 (TID 262) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.277 TaskSetManager: INFO: Finished task 5.0 in stage 50.0 (TID 261) in 158 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:22.278 Executor: INFO: Running task 6.0 in stage 50.0 (TID 262) 2023-04-22 21:12:22.303 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:22.430 Executor: INFO: Finished task 6.0 in stage 50.0 (TID 262). 1197 bytes result sent to driver 2023-04-22 21:12:22.432 TaskSetManager: INFO: Starting task 7.0 in stage 50.0 (TID 263) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.432 TaskSetManager: INFO: Finished task 6.0 in stage 50.0 (TID 262) in 155 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:22.433 Executor: INFO: Running task 7.0 in stage 50.0 (TID 263) 2023-04-22 21:12:22.461 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:22.584 Executor: INFO: Finished task 7.0 in stage 50.0 (TID 263). 
1197 bytes result sent to driver 2023-04-22 21:12:22.584 TaskSetManager: INFO: Finished task 7.0 in stage 50.0 (TID 263) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:22.584 TaskSchedulerImpl: INFO: Removed TaskSet 50.0, whose tasks have all completed, from pool 2023-04-22 21:12:22.584 DAGScheduler: INFO: ShuffleMapStage 50 (treeAggregate at RowMatrix.scala:94) finished in 1.378 s 2023-04-22 21:12:22.585 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:22.585 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:22.585 DAGScheduler: INFO: waiting: Set(ResultStage 51) 2023-04-22 21:12:22.585 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:22.585 DAGScheduler: INFO: Submitting ResultStage 51 (MapPartitionsRDD[114] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:22.636 MemoryStore: INFO: Block broadcast_98 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:22.642 MemoryStore: INFO: Block broadcast_98_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:22.643 BlockManagerInfo: INFO: Added broadcast_98_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:22.643 SparkContext: INFO: Created broadcast 98 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:22.643 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 51 (MapPartitionsRDD[114] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:22.643 TaskSchedulerImpl: INFO: Adding task set 51.0 with 2 tasks resource profile 0 2023-04-22 21:12:22.644 TaskSetManager: INFO: Starting task 0.0 in stage 51.0 (TID 264) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.644 Executor: INFO: Running task 0.0 in stage 51.0 (TID 264) 2023-04-22 21:12:22.670 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:22.670 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:22.675 Executor: INFO: Finished task 0.0 in stage 51.0 (TID 264). 34646 bytes result sent to driver 2023-04-22 21:12:22.675 TaskSetManager: INFO: Starting task 1.0 in stage 51.0 (TID 265) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.676 TaskSetManager: INFO: Finished task 0.0 in stage 51.0 (TID 264) in 32 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:22.677 Executor: INFO: Running task 1.0 in stage 51.0 (TID 265) 2023-04-22 21:12:22.702 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:22.702 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:22.706 Executor: INFO: Finished task 1.0 in stage 51.0 (TID 265). 
34646 bytes result sent to driver 2023-04-22 21:12:22.708 TaskSetManager: INFO: Finished task 1.0 in stage 51.0 (TID 265) in 33 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:22.708 TaskSchedulerImpl: INFO: Removed TaskSet 51.0, whose tasks have all completed, from pool 2023-04-22 21:12:22.708 DAGScheduler: INFO: ResultStage 51 (treeAggregate at RowMatrix.scala:94) finished in 0.123 s 2023-04-22 21:12:22.708 DAGScheduler: INFO: Job 26 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:22.708 TaskSchedulerImpl: INFO: Killing all running tasks in stage 51: Stage finished 2023-04-22 21:12:22.708 DAGScheduler: INFO: Job 26 finished: treeAggregate at RowMatrix.scala:94, took 1.507279 s 2023-04-22 21:12:22.716 MemoryStore: INFO: Block broadcast_99 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:22.722 MemoryStore: INFO: Block broadcast_99_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:22.723 BlockManagerInfo: INFO: Added broadcast_99_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:22.723 SparkContext: INFO: Created broadcast 99 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:22.790 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:22.791 DAGScheduler: INFO: Registering RDD 116 (treeAggregate at RowMatrix.scala:94) as input to shuffle 25 2023-04-22 21:12:22.791 DAGScheduler: INFO: Got job 27 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:22.791 DAGScheduler: INFO: Final stage: ResultStage 53 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:22.791 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 52) 2023-04-22 21:12:22.791 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 52) 2023-04-22 21:12:22.792 DAGScheduler: INFO: Submitting ShuffleMapStage 52 (MapPartitionsRDD[116] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:22.831 MemoryStore: INFO: Block broadcast_100 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:22.837 MemoryStore: INFO: Block broadcast_100_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:22.838 BlockManagerInfo: INFO: Added broadcast_100_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:22.838 SparkContext: INFO: Created broadcast 100 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:22.838 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 52 (MapPartitionsRDD[116] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:22.838 TaskSchedulerImpl: INFO: Adding task set 52.0 with 8 tasks resource profile 0 2023-04-22 21:12:22.839 TaskSetManager: INFO: Starting task 0.0 in stage 52.0 (TID 266) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.839 Executor: INFO: Running task 0.0 in stage 52.0 (TID 266) 2023-04-22 21:12:22.866 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:22.989 Executor: INFO: Finished task 0.0 in stage 52.0 (TID 266). 
1197 bytes result sent to driver 2023-04-22 21:12:22.990 TaskSetManager: INFO: Starting task 1.0 in stage 52.0 (TID 267) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:22.991 TaskSetManager: INFO: Finished task 0.0 in stage 52.0 (TID 266) in 152 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:23.000 Executor: INFO: Running task 1.0 in stage 52.0 (TID 267) 2023-04-22 21:12:23.025 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:23.151 Executor: INFO: Finished task 1.0 in stage 52.0 (TID 267). 1197 bytes result sent to driver 2023-04-22 21:12:23.152 TaskSetManager: INFO: Starting task 2.0 in stage 52.0 (TID 268) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.152 TaskSetManager: INFO: Finished task 1.0 in stage 52.0 (TID 267) in 162 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:23.153 Executor: INFO: Running task 2.0 in stage 52.0 (TID 268) 2023-04-22 21:12:23.178 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:23.301 Executor: INFO: Finished task 2.0 in stage 52.0 (TID 268). 1197 bytes result sent to driver 2023-04-22 21:12:23.301 TaskSetManager: INFO: Starting task 3.0 in stage 52.0 (TID 269) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.302 TaskSetManager: INFO: Finished task 2.0 in stage 52.0 (TID 268) in 150 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:23.302 Executor: INFO: Running task 3.0 in stage 52.0 (TID 269) 2023-04-22 21:12:23.328 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:23.456 Executor: INFO: Finished task 3.0 in stage 52.0 (TID 269). 1197 bytes result sent to driver 2023-04-22 21:12:23.456 TaskSetManager: INFO: Starting task 4.0 in stage 52.0 (TID 270) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.457 TaskSetManager: INFO: Finished task 3.0 in stage 52.0 (TID 269) in 156 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:23.457 Executor: INFO: Running task 4.0 in stage 52.0 (TID 270) 2023-04-22 21:12:23.483 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:23.625 Executor: INFO: Finished task 4.0 in stage 52.0 (TID 270). 1197 bytes result sent to driver 2023-04-22 21:12:23.625 TaskSetManager: INFO: Starting task 5.0 in stage 52.0 (TID 271) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.627 TaskSetManager: INFO: Finished task 4.0 in stage 52.0 (TID 270) in 171 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:23.627 Executor: INFO: Running task 5.0 in stage 52.0 (TID 271) 2023-04-22 21:12:23.655 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:23.780 Executor: INFO: Finished task 5.0 in stage 52.0 (TID 271). 
1197 bytes result sent to driver 2023-04-22 21:12:23.780 TaskSetManager: INFO: Starting task 6.0 in stage 52.0 (TID 272) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.782 TaskSetManager: INFO: Finished task 5.0 in stage 52.0 (TID 271) in 157 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:23.783 Executor: INFO: Running task 6.0 in stage 52.0 (TID 272) 2023-04-22 21:12:23.811 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:23.932 Executor: INFO: Finished task 6.0 in stage 52.0 (TID 272). 1197 bytes result sent to driver 2023-04-22 21:12:23.932 TaskSetManager: INFO: Starting task 7.0 in stage 52.0 (TID 273) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:23.934 TaskSetManager: INFO: Finished task 6.0 in stage 52.0 (TID 272) in 154 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:23.934 Executor: INFO: Running task 7.0 in stage 52.0 (TID 273) 2023-04-22 21:12:23.960 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:24.083 Executor: INFO: Finished task 7.0 in stage 52.0 (TID 273). 1197 bytes result sent to driver 2023-04-22 21:12:24.084 TaskSetManager: INFO: Finished task 7.0 in stage 52.0 (TID 273) in 152 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:24.084 TaskSchedulerImpl: INFO: Removed TaskSet 52.0, whose tasks have all completed, from pool 2023-04-22 21:12:24.084 DAGScheduler: INFO: ShuffleMapStage 52 (treeAggregate at RowMatrix.scala:94) finished in 1.291 s 2023-04-22 21:12:24.084 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:24.084 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:24.084 DAGScheduler: INFO: waiting: Set(ResultStage 53) 2023-04-22 21:12:24.084 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:24.085 DAGScheduler: INFO: Submitting ResultStage 53 (MapPartitionsRDD[118] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:24.126 MemoryStore: INFO: Block broadcast_101 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:24.132 MemoryStore: INFO: Block broadcast_101_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:24.132 BlockManagerInfo: INFO: Added broadcast_101_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:24.133 SparkContext: INFO: Created broadcast 101 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:24.133 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 53 (MapPartitionsRDD[118] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:24.133 TaskSchedulerImpl: INFO: Adding task set 53.0 with 2 tasks resource profile 0 2023-04-22 21:12:24.134 TaskSetManager: INFO: Starting task 0.0 in stage 53.0 (TID 274) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.134 Executor: INFO: Running task 0.0 in stage 53.0 (TID 274) 2023-04-22 21:12:24.159 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:24.159 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:24.163 Executor: INFO: Finished task 0.0 in stage 53.0 (TID 274). 34646 bytes result sent to driver 2023-04-22 21:12:24.164 TaskSetManager: INFO: Starting task 1.0 in stage 53.0 (TID 275) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.168 TaskSetManager: INFO: Finished task 0.0 in stage 53.0 (TID 274) in 35 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:24.169 Executor: INFO: Running task 1.0 in stage 53.0 (TID 275) 2023-04-22 21:12:24.194 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:24.194 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:24.198 Executor: INFO: Finished task 1.0 in stage 53.0 (TID 275). 34646 bytes result sent to driver 2023-04-22 21:12:24.201 TaskSetManager: INFO: Finished task 1.0 in stage 53.0 (TID 275) in 37 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:24.201 TaskSchedulerImpl: INFO: Removed TaskSet 53.0, whose tasks have all completed, from pool 2023-04-22 21:12:24.202 DAGScheduler: INFO: ResultStage 53 (treeAggregate at RowMatrix.scala:94) finished in 0.117 s 2023-04-22 21:12:24.202 DAGScheduler: INFO: Job 27 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:24.202 TaskSchedulerImpl: INFO: Killing all running tasks in stage 53: Stage finished 2023-04-22 21:12:24.202 DAGScheduler: INFO: Job 27 finished: treeAggregate at RowMatrix.scala:94, took 1.412571 s 2023-04-22 21:12:24.204 MemoryStore: INFO: Block broadcast_102 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:24.205 MemoryStore: INFO: Block broadcast_102_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:24.206 BlockManagerInfo: INFO: Added broadcast_102_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:24.206 SparkContext: INFO: Created broadcast 102 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:24.277 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:24.277 DAGScheduler: INFO: Registering RDD 120 (treeAggregate at RowMatrix.scala:94) as input to shuffle 26 2023-04-22 21:12:24.278 DAGScheduler: INFO: Got job 28 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:24.278 DAGScheduler: INFO: Final stage: ResultStage 55 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:24.278 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 54) 2023-04-22 21:12:24.278 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 54) 2023-04-22 21:12:24.286 DAGScheduler: INFO: Submitting ShuffleMapStage 54 (MapPartitionsRDD[120] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:24.321 MemoryStore: INFO: Block broadcast_103 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:24.327 MemoryStore: INFO: Block broadcast_103_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:24.328 BlockManagerInfo: INFO: Added broadcast_103_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:24.328 SparkContext: INFO: Created broadcast 103 from broadcast at DAGScheduler.scala:1513 
2023-04-22 21:12:24.328 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 54 (MapPartitionsRDD[120] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:24.328 TaskSchedulerImpl: INFO: Adding task set 54.0 with 8 tasks resource profile 0 2023-04-22 21:12:24.329 TaskSetManager: INFO: Starting task 0.0 in stage 54.0 (TID 276) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.329 Executor: INFO: Running task 0.0 in stage 54.0 (TID 276) 2023-04-22 21:12:24.359 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:24.486 Executor: INFO: Finished task 0.0 in stage 54.0 (TID 276). 1197 bytes result sent to driver 2023-04-22 21:12:24.486 TaskSetManager: INFO: Starting task 1.0 in stage 54.0 (TID 277) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.486 TaskSetManager: INFO: Finished task 0.0 in stage 54.0 (TID 276) in 157 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:24.487 Executor: INFO: Running task 1.0 in stage 54.0 (TID 277) 2023-04-22 21:12:24.512 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:24.652 Executor: INFO: Finished task 1.0 in stage 54.0 (TID 277). 1197 bytes result sent to driver 2023-04-22 21:12:24.653 TaskSetManager: INFO: Starting task 2.0 in stage 54.0 (TID 278) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.653 TaskSetManager: INFO: Finished task 1.0 in stage 54.0 (TID 277) in 167 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:24.680 Executor: INFO: Running task 2.0 in stage 54.0 (TID 278) 2023-04-22 21:12:24.718 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:24.843 Executor: INFO: Finished task 2.0 in stage 54.0 (TID 278). 1197 bytes result sent to driver 2023-04-22 21:12:24.843 TaskSetManager: INFO: Starting task 3.0 in stage 54.0 (TID 279) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.844 TaskSetManager: INFO: Finished task 2.0 in stage 54.0 (TID 278) in 191 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:24.845 Executor: INFO: Running task 3.0 in stage 54.0 (TID 279) 2023-04-22 21:12:24.870 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:24.993 Executor: INFO: Finished task 3.0 in stage 54.0 (TID 279). 1197 bytes result sent to driver 2023-04-22 21:12:24.994 TaskSetManager: INFO: Starting task 4.0 in stage 54.0 (TID 280) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:24.994 TaskSetManager: INFO: Finished task 3.0 in stage 54.0 (TID 279) in 151 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:24.994 Executor: INFO: Running task 4.0 in stage 54.0 (TID 280) 2023-04-22 21:12:25.019 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:25.143 Executor: INFO: Finished task 4.0 in stage 54.0 (TID 280). 
1197 bytes result sent to driver 2023-04-22 21:12:25.143 TaskSetManager: INFO: Starting task 5.0 in stage 54.0 (TID 281) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.144 TaskSetManager: INFO: Finished task 4.0 in stage 54.0 (TID 280) in 151 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:25.145 Executor: INFO: Running task 5.0 in stage 54.0 (TID 281) 2023-04-22 21:12:25.170 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:25.294 Executor: INFO: Finished task 5.0 in stage 54.0 (TID 281). 1197 bytes result sent to driver 2023-04-22 21:12:25.295 TaskSetManager: INFO: Starting task 6.0 in stage 54.0 (TID 282) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.296 TaskSetManager: INFO: Finished task 5.0 in stage 54.0 (TID 281) in 153 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:25.298 Executor: INFO: Running task 6.0 in stage 54.0 (TID 282) 2023-04-22 21:12:25.323 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:25.444 Executor: INFO: Finished task 6.0 in stage 54.0 (TID 282). 1197 bytes result sent to driver 2023-04-22 21:12:25.444 TaskSetManager: INFO: Starting task 7.0 in stage 54.0 (TID 283) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.445 TaskSetManager: INFO: Finished task 6.0 in stage 54.0 (TID 282) in 150 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:25.447 Executor: INFO: Running task 7.0 in stage 54.0 (TID 283) 2023-04-22 21:12:25.472 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:25.594 Executor: INFO: Finished task 7.0 in stage 54.0 (TID 283). 
1197 bytes result sent to driver 2023-04-22 21:12:25.598 TaskSetManager: INFO: Finished task 7.0 in stage 54.0 (TID 283) in 154 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:25.598 TaskSchedulerImpl: INFO: Removed TaskSet 54.0, whose tasks have all completed, from pool 2023-04-22 21:12:25.598 DAGScheduler: INFO: ShuffleMapStage 54 (treeAggregate at RowMatrix.scala:94) finished in 1.312 s 2023-04-22 21:12:25.598 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:25.598 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:25.598 DAGScheduler: INFO: waiting: Set(ResultStage 55) 2023-04-22 21:12:25.598 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:25.599 DAGScheduler: INFO: Submitting ResultStage 55 (MapPartitionsRDD[122] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:25.646 MemoryStore: INFO: Block broadcast_104 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:25.652 MemoryStore: INFO: Block broadcast_104_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:25.654 BlockManagerInfo: INFO: Added broadcast_104_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:25.655 SparkContext: INFO: Created broadcast 104 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:25.655 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 55 (MapPartitionsRDD[122] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:25.655 TaskSchedulerImpl: INFO: Adding task set 55.0 with 2 tasks resource profile 0 2023-04-22 21:12:25.656 TaskSetManager: INFO: Starting task 0.0 in stage 55.0 (TID 284) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.656 Executor: INFO: Running task 0.0 in stage 55.0 (TID 284) 2023-04-22 21:12:25.691 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:25.692 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:25.697 Executor: INFO: Finished task 0.0 in stage 55.0 (TID 284). 34646 bytes result sent to driver 2023-04-22 21:12:25.698 TaskSetManager: INFO: Starting task 1.0 in stage 55.0 (TID 285) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.698 TaskSetManager: INFO: Finished task 0.0 in stage 55.0 (TID 284) in 42 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:25.699 Executor: INFO: Running task 1.0 in stage 55.0 (TID 285) 2023-04-22 21:12:25.724 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:25.724 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:25.728 Executor: INFO: Finished task 1.0 in stage 55.0 (TID 285). 
34646 bytes result sent to driver 2023-04-22 21:12:25.730 TaskSetManager: INFO: Finished task 1.0 in stage 55.0 (TID 285) in 32 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:25.730 TaskSchedulerImpl: INFO: Removed TaskSet 55.0, whose tasks have all completed, from pool 2023-04-22 21:12:25.730 DAGScheduler: INFO: ResultStage 55 (treeAggregate at RowMatrix.scala:94) finished in 0.131 s 2023-04-22 21:12:25.730 DAGScheduler: INFO: Job 28 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:25.730 TaskSchedulerImpl: INFO: Killing all running tasks in stage 55: Stage finished 2023-04-22 21:12:25.731 DAGScheduler: INFO: Job 28 finished: treeAggregate at RowMatrix.scala:94, took 1.453941 s 2023-04-22 21:12:25.732 MemoryStore: INFO: Block broadcast_105 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:25.734 MemoryStore: INFO: Block broadcast_105_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:25.734 BlockManagerInfo: INFO: Added broadcast_105_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:25.735 SparkContext: INFO: Created broadcast 105 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:25.804 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:25.805 DAGScheduler: INFO: Registering RDD 124 (treeAggregate at RowMatrix.scala:94) as input to shuffle 27 2023-04-22 21:12:25.805 DAGScheduler: INFO: Got job 29 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:25.805 DAGScheduler: INFO: Final stage: ResultStage 57 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:25.805 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 56) 2023-04-22 21:12:25.805 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 56) 2023-04-22 21:12:25.807 DAGScheduler: INFO: Submitting ShuffleMapStage 56 (MapPartitionsRDD[124] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:25.843 MemoryStore: INFO: Block broadcast_106 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:25.849 MemoryStore: INFO: Block broadcast_106_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:25.849 BlockManagerInfo: INFO: Added broadcast_106_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:25.850 SparkContext: INFO: Created broadcast 106 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:25.850 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 56 (MapPartitionsRDD[124] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:25.850 TaskSchedulerImpl: INFO: Adding task set 56.0 with 8 tasks resource profile 0 2023-04-22 21:12:25.851 TaskSetManager: INFO: Starting task 0.0 in stage 56.0 (TID 286) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:25.851 Executor: INFO: Running task 0.0 in stage 56.0 (TID 286) 2023-04-22 21:12:25.876 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:26.000 Executor: INFO: Finished task 0.0 in stage 56.0 (TID 286). 
1197 bytes result sent to driver 2023-04-22 21:12:26.001 TaskSetManager: INFO: Starting task 1.0 in stage 56.0 (TID 287) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.001 TaskSetManager: INFO: Finished task 0.0 in stage 56.0 (TID 286) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:26.008 Executor: INFO: Running task 1.0 in stage 56.0 (TID 287) 2023-04-22 21:12:26.034 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:26.160 Executor: INFO: Finished task 1.0 in stage 56.0 (TID 287). 1197 bytes result sent to driver 2023-04-22 21:12:26.161 TaskSetManager: INFO: Starting task 2.0 in stage 56.0 (TID 288) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.161 TaskSetManager: INFO: Finished task 1.0 in stage 56.0 (TID 287) in 160 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:26.161 Executor: INFO: Running task 2.0 in stage 56.0 (TID 288) 2023-04-22 21:12:26.186 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:26.315 Executor: INFO: Finished task 2.0 in stage 56.0 (TID 288). 1197 bytes result sent to driver 2023-04-22 21:12:26.315 TaskSetManager: INFO: Starting task 3.0 in stage 56.0 (TID 289) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.315 TaskSetManager: INFO: Finished task 2.0 in stage 56.0 (TID 288) in 155 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:26.316 Executor: INFO: Running task 3.0 in stage 56.0 (TID 289) 2023-04-22 21:12:26.341 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:26.463 Executor: INFO: Finished task 3.0 in stage 56.0 (TID 289). 1197 bytes result sent to driver 2023-04-22 21:12:26.463 TaskSetManager: INFO: Starting task 4.0 in stage 56.0 (TID 290) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.464 TaskSetManager: INFO: Finished task 3.0 in stage 56.0 (TID 289) in 149 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:26.469 Executor: INFO: Running task 4.0 in stage 56.0 (TID 290) 2023-04-22 21:12:26.494 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:26.622 Executor: INFO: Finished task 4.0 in stage 56.0 (TID 290). 1197 bytes result sent to driver 2023-04-22 21:12:26.623 TaskSetManager: INFO: Starting task 5.0 in stage 56.0 (TID 291) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.623 TaskSetManager: INFO: Finished task 4.0 in stage 56.0 (TID 290) in 160 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:26.623 Executor: INFO: Running task 5.0 in stage 56.0 (TID 291) 2023-04-22 21:12:26.664 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:26.787 Executor: INFO: Finished task 5.0 in stage 56.0 (TID 291). 
1197 bytes result sent to driver 2023-04-22 21:12:26.788 TaskSetManager: INFO: Starting task 6.0 in stage 56.0 (TID 292) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.788 TaskSetManager: INFO: Finished task 5.0 in stage 56.0 (TID 291) in 166 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:26.788 Executor: INFO: Running task 6.0 in stage 56.0 (TID 292) 2023-04-22 21:12:26.813 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:26.934 Executor: INFO: Finished task 6.0 in stage 56.0 (TID 292). 1197 bytes result sent to driver 2023-04-22 21:12:26.935 TaskSetManager: INFO: Starting task 7.0 in stage 56.0 (TID 293) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:26.935 TaskSetManager: INFO: Finished task 6.0 in stage 56.0 (TID 292) in 147 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:26.936 Executor: INFO: Running task 7.0 in stage 56.0 (TID 293) 2023-04-22 21:12:26.960 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:27.081 Executor: INFO: Finished task 7.0 in stage 56.0 (TID 293). 1197 bytes result sent to driver 2023-04-22 21:12:27.082 TaskSetManager: INFO: Finished task 7.0 in stage 56.0 (TID 293) in 147 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:27.082 TaskSchedulerImpl: INFO: Removed TaskSet 56.0, whose tasks have all completed, from pool 2023-04-22 21:12:27.082 DAGScheduler: INFO: ShuffleMapStage 56 (treeAggregate at RowMatrix.scala:94) finished in 1.274 s 2023-04-22 21:12:27.082 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:27.082 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:27.082 DAGScheduler: INFO: waiting: Set(ResultStage 57) 2023-04-22 21:12:27.082 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:27.082 DAGScheduler: INFO: Submitting ResultStage 57 (MapPartitionsRDD[126] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:27.122 MemoryStore: INFO: Block broadcast_107 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:27.130 MemoryStore: INFO: Block broadcast_107_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:27.130 BlockManagerInfo: INFO: Added broadcast_107_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:27.132 SparkContext: INFO: Created broadcast 107 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:27.132 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 57 (MapPartitionsRDD[126] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:27.132 TaskSchedulerImpl: INFO: Adding task set 57.0 with 2 tasks resource profile 0 2023-04-22 21:12:27.132 TaskSetManager: INFO: Starting task 0.0 in stage 57.0 (TID 294) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.133 Executor: INFO: Running task 0.0 in stage 57.0 (TID 294) 2023-04-22 21:12:27.158 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:27.158 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:27.162 Executor: INFO: Finished task 0.0 in stage 57.0 (TID 294). 34646 bytes result sent to driver 2023-04-22 21:12:27.163 TaskSetManager: INFO: Starting task 1.0 in stage 57.0 (TID 295) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.163 TaskSetManager: INFO: Finished task 0.0 in stage 57.0 (TID 294) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:27.164 Executor: INFO: Running task 1.0 in stage 57.0 (TID 295) 2023-04-22 21:12:27.189 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:27.189 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:27.194 Executor: INFO: Finished task 1.0 in stage 57.0 (TID 295). 34646 bytes result sent to driver 2023-04-22 21:12:27.195 TaskSetManager: INFO: Finished task 1.0 in stage 57.0 (TID 295) in 32 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:27.195 TaskSchedulerImpl: INFO: Removed TaskSet 57.0, whose tasks have all completed, from pool 2023-04-22 21:12:27.195 DAGScheduler: INFO: ResultStage 57 (treeAggregate at RowMatrix.scala:94) finished in 0.112 s 2023-04-22 21:12:27.195 DAGScheduler: INFO: Job 29 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:27.195 TaskSchedulerImpl: INFO: Killing all running tasks in stage 57: Stage finished 2023-04-22 21:12:27.195 DAGScheduler: INFO: Job 29 finished: treeAggregate at RowMatrix.scala:94, took 1.391177 s 2023-04-22 21:12:27.197 MemoryStore: INFO: Block broadcast_108 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:27.199 MemoryStore: INFO: Block broadcast_108_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:27.200 BlockManagerInfo: INFO: Added broadcast_108_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:27.201 SparkContext: INFO: Created broadcast 108 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:27.263 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:27.264 DAGScheduler: INFO: Registering RDD 128 (treeAggregate at RowMatrix.scala:94) as input to shuffle 28 2023-04-22 21:12:27.264 DAGScheduler: INFO: Got job 30 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:27.264 DAGScheduler: INFO: Final stage: ResultStage 59 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:27.264 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 58) 2023-04-22 21:12:27.264 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 58) 2023-04-22 21:12:27.273 DAGScheduler: INFO: Submitting ShuffleMapStage 58 (MapPartitionsRDD[128] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:27.331 MemoryStore: INFO: Block broadcast_109 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:27.346 MemoryStore: INFO: Block broadcast_109_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:27.348 BlockManagerInfo: INFO: Added broadcast_109_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:27.348 SparkContext: INFO: Created broadcast 109 from broadcast at DAGScheduler.scala:1513 
2023-04-22 21:12:27.349 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 58 (MapPartitionsRDD[128] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:27.349 TaskSchedulerImpl: INFO: Adding task set 58.0 with 8 tasks resource profile 0 2023-04-22 21:12:27.349 TaskSetManager: INFO: Starting task 0.0 in stage 58.0 (TID 296) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.350 Executor: INFO: Running task 0.0 in stage 58.0 (TID 296) 2023-04-22 21:12:27.376 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:27.499 Executor: INFO: Finished task 0.0 in stage 58.0 (TID 296). 1197 bytes result sent to driver 2023-04-22 21:12:27.499 TaskSetManager: INFO: Starting task 1.0 in stage 58.0 (TID 297) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.500 TaskSetManager: INFO: Finished task 0.0 in stage 58.0 (TID 296) in 151 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:27.500 Executor: INFO: Running task 1.0 in stage 58.0 (TID 297) 2023-04-22 21:12:27.526 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:27.650 Executor: INFO: Finished task 1.0 in stage 58.0 (TID 297). 1197 bytes result sent to driver 2023-04-22 21:12:27.651 TaskSetManager: INFO: Starting task 2.0 in stage 58.0 (TID 298) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.653 TaskSetManager: INFO: Finished task 1.0 in stage 58.0 (TID 297) in 154 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:27.653 Executor: INFO: Running task 2.0 in stage 58.0 (TID 298) 2023-04-22 21:12:27.687 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:27.816 Executor: INFO: Finished task 2.0 in stage 58.0 (TID 298). 1197 bytes result sent to driver 2023-04-22 21:12:27.816 TaskSetManager: INFO: Starting task 3.0 in stage 58.0 (TID 299) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.816 TaskSetManager: INFO: Finished task 2.0 in stage 58.0 (TID 298) in 165 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:27.817 Executor: INFO: Running task 3.0 in stage 58.0 (TID 299) 2023-04-22 21:12:27.842 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:27.972 Executor: INFO: Finished task 3.0 in stage 58.0 (TID 299). 1197 bytes result sent to driver 2023-04-22 21:12:27.972 TaskSetManager: INFO: Starting task 4.0 in stage 58.0 (TID 300) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:27.972 TaskSetManager: INFO: Finished task 3.0 in stage 58.0 (TID 299) in 156 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:27.972 Executor: INFO: Running task 4.0 in stage 58.0 (TID 300) 2023-04-22 21:12:27.998 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:28.140 Executor: INFO: Finished task 4.0 in stage 58.0 (TID 300). 
1197 bytes result sent to driver 2023-04-22 21:12:28.142 TaskSetManager: INFO: Starting task 5.0 in stage 58.0 (TID 301) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.142 TaskSetManager: INFO: Finished task 4.0 in stage 58.0 (TID 300) in 170 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:28.147 Executor: INFO: Running task 5.0 in stage 58.0 (TID 301) 2023-04-22 21:12:28.172 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:28.297 Executor: INFO: Finished task 5.0 in stage 58.0 (TID 301). 1197 bytes result sent to driver 2023-04-22 21:12:28.297 TaskSetManager: INFO: Starting task 6.0 in stage 58.0 (TID 302) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.308 TaskSetManager: INFO: Finished task 5.0 in stage 58.0 (TID 301) in 166 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:28.321 Executor: INFO: Running task 6.0 in stage 58.0 (TID 302) 2023-04-22 21:12:28.359 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:28.480 Executor: INFO: Finished task 6.0 in stage 58.0 (TID 302). 1197 bytes result sent to driver 2023-04-22 21:12:28.481 TaskSetManager: INFO: Starting task 7.0 in stage 58.0 (TID 303) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.481 TaskSetManager: INFO: Finished task 6.0 in stage 58.0 (TID 302) in 184 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:28.483 Executor: INFO: Running task 7.0 in stage 58.0 (TID 303) 2023-04-22 21:12:28.508 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:28.636 Executor: INFO: Finished task 7.0 in stage 58.0 (TID 303). 
1197 bytes result sent to driver 2023-04-22 21:12:28.637 TaskSetManager: INFO: Finished task 7.0 in stage 58.0 (TID 303) in 156 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:28.637 TaskSchedulerImpl: INFO: Removed TaskSet 58.0, whose tasks have all completed, from pool 2023-04-22 21:12:28.637 DAGScheduler: INFO: ShuffleMapStage 58 (treeAggregate at RowMatrix.scala:94) finished in 1.364 s 2023-04-22 21:12:28.637 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:28.637 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:28.637 DAGScheduler: INFO: waiting: Set(ResultStage 59) 2023-04-22 21:12:28.637 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:28.637 DAGScheduler: INFO: Submitting ResultStage 59 (MapPartitionsRDD[130] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:28.679 MemoryStore: INFO: Block broadcast_110 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:28.685 MemoryStore: INFO: Block broadcast_110_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:28.686 BlockManagerInfo: INFO: Added broadcast_110_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:28.686 SparkContext: INFO: Created broadcast 110 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:28.686 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 59 (MapPartitionsRDD[130] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:28.686 TaskSchedulerImpl: INFO: Adding task set 59.0 with 2 tasks resource profile 0 2023-04-22 21:12:28.689 TaskSetManager: INFO: Starting task 0.0 in stage 59.0 (TID 304) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.689 Executor: INFO: Running task 0.0 in stage 59.0 (TID 304) 2023-04-22 21:12:28.714 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:28.714 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:28.719 Executor: INFO: Finished task 0.0 in stage 59.0 (TID 304). 34646 bytes result sent to driver 2023-04-22 21:12:28.719 TaskSetManager: INFO: Starting task 1.0 in stage 59.0 (TID 305) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.719 TaskSetManager: INFO: Finished task 0.0 in stage 59.0 (TID 304) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:28.720 Executor: INFO: Running task 1.0 in stage 59.0 (TID 305) 2023-04-22 21:12:28.745 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:28.745 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:28.750 Executor: INFO: Finished task 1.0 in stage 59.0 (TID 305). 
34646 bytes result sent to driver 2023-04-22 21:12:28.754 TaskSetManager: INFO: Finished task 1.0 in stage 59.0 (TID 305) in 35 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:28.754 TaskSchedulerImpl: INFO: Removed TaskSet 59.0, whose tasks have all completed, from pool 2023-04-22 21:12:28.754 DAGScheduler: INFO: ResultStage 59 (treeAggregate at RowMatrix.scala:94) finished in 0.116 s 2023-04-22 21:12:28.754 DAGScheduler: INFO: Job 30 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:28.754 TaskSchedulerImpl: INFO: Killing all running tasks in stage 59: Stage finished 2023-04-22 21:12:28.755 DAGScheduler: INFO: Job 30 finished: treeAggregate at RowMatrix.scala:94, took 1.491544 s 2023-04-22 21:12:28.756 MemoryStore: INFO: Block broadcast_111 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:28.758 MemoryStore: INFO: Block broadcast_111_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:28.758 BlockManagerInfo: INFO: Added broadcast_111_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:28.769 SparkContext: INFO: Created broadcast 111 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:28.853 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:28.853 DAGScheduler: INFO: Registering RDD 132 (treeAggregate at RowMatrix.scala:94) as input to shuffle 29 2023-04-22 21:12:28.854 DAGScheduler: INFO: Got job 31 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:28.854 DAGScheduler: INFO: Final stage: ResultStage 61 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:28.854 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 60) 2023-04-22 21:12:28.854 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 60) 2023-04-22 21:12:28.857 DAGScheduler: INFO: Submitting ShuffleMapStage 60 (MapPartitionsRDD[132] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:28.898 MemoryStore: INFO: Block broadcast_112 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:28.904 MemoryStore: INFO: Block broadcast_112_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:28.905 BlockManagerInfo: INFO: Added broadcast_112_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:28.905 SparkContext: INFO: Created broadcast 112 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:28.905 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 60 (MapPartitionsRDD[132] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:28.905 TaskSchedulerImpl: INFO: Adding task set 60.0 with 8 tasks resource profile 0 2023-04-22 21:12:28.906 TaskSetManager: INFO: Starting task 0.0 in stage 60.0 (TID 306) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:28.906 Executor: INFO: Running task 0.0 in stage 60.0 (TID 306) 2023-04-22 21:12:28.934 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:29.059 Executor: INFO: Finished task 0.0 in stage 60.0 (TID 306). 
1197 bytes result sent to driver 2023-04-22 21:12:29.060 TaskSetManager: INFO: Starting task 1.0 in stage 60.0 (TID 307) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:29.060 TaskSetManager: INFO: Finished task 0.0 in stage 60.0 (TID 306) in 154 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:29.064 Executor: INFO: Running task 1.0 in stage 60.0 (TID 307) 2023-04-22 21:12:29.158 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:29.189 BlockManagerInfo: INFO: Removed broadcast_101_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.190 BlockManagerInfo: INFO: Removed broadcast_94_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.191 BlockManagerInfo: INFO: Removed broadcast_96_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.193 BlockManagerInfo: INFO: Removed broadcast_95_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.195 BlockManagerInfo: INFO: Removed broadcast_99_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.197 BlockManagerInfo: INFO: Removed broadcast_100_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.200 BlockManagerInfo: INFO: Removed broadcast_108_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.200 BlockManagerInfo: INFO: Removed broadcast_110_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.201 BlockManagerInfo: INFO: Removed broadcast_93_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.208 BlockManagerInfo: INFO: Removed broadcast_104_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.210 BlockManagerInfo: INFO: Removed broadcast_102_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.211 BlockManagerInfo: INFO: Removed broadcast_97_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.211 BlockManagerInfo: INFO: Removed broadcast_98_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.213 BlockManagerInfo: INFO: Removed broadcast_107_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.214 BlockManagerInfo: INFO: Removed broadcast_109_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.215 BlockManagerInfo: INFO: Removed broadcast_106_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.217 BlockManagerInfo: INFO: Removed broadcast_105_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.221 BlockManagerInfo: INFO: Removed broadcast_103_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:29.339 Executor: INFO: Finished task 1.0 in stage 60.0 (TID 307). 
1240 bytes result sent to driver 2023-04-22 21:12:29.340 TaskSetManager: INFO: Starting task 2.0 in stage 60.0 (TID 308) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:29.340 TaskSetManager: INFO: Finished task 1.0 in stage 60.0 (TID 307) in 281 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:29.341 Executor: INFO: Running task 2.0 in stage 60.0 (TID 308) 2023-04-22 21:12:29.368 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:29.504 Executor: INFO: Finished task 2.0 in stage 60.0 (TID 308). 1197 bytes result sent to driver 2023-04-22 21:12:29.504 TaskSetManager: INFO: Starting task 3.0 in stage 60.0 (TID 309) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:29.505 TaskSetManager: INFO: Finished task 2.0 in stage 60.0 (TID 308) in 165 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:29.524 Executor: INFO: Running task 3.0 in stage 60.0 (TID 309) 2023-04-22 21:12:29.553 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:29.676 Executor: INFO: Finished task 3.0 in stage 60.0 (TID 309). 1197 bytes result sent to driver 2023-04-22 21:12:29.676 TaskSetManager: INFO: Starting task 4.0 in stage 60.0 (TID 310) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:29.676 TaskSetManager: INFO: Finished task 3.0 in stage 60.0 (TID 309) in 172 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:29.700 Executor: INFO: Running task 4.0 in stage 60.0 (TID 310) 2023-04-22 21:12:29.734 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:29.864 Executor: INFO: Finished task 4.0 in stage 60.0 (TID 310). 1197 bytes result sent to driver 2023-04-22 21:12:29.864 TaskSetManager: INFO: Starting task 5.0 in stage 60.0 (TID 311) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:29.865 TaskSetManager: INFO: Finished task 4.0 in stage 60.0 (TID 310) in 189 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:29.865 Executor: INFO: Running task 5.0 in stage 60.0 (TID 311) 2023-04-22 21:12:29.891 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:30.015 Executor: INFO: Finished task 5.0 in stage 60.0 (TID 311). 1197 bytes result sent to driver 2023-04-22 21:12:30.015 TaskSetManager: INFO: Starting task 6.0 in stage 60.0 (TID 312) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.017 TaskSetManager: INFO: Finished task 5.0 in stage 60.0 (TID 311) in 153 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:30.017 Executor: INFO: Running task 6.0 in stage 60.0 (TID 312) 2023-04-22 21:12:30.043 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:30.186 Executor: INFO: Finished task 6.0 in stage 60.0 (TID 312). 
1197 bytes result sent to driver 2023-04-22 21:12:30.186 TaskSetManager: INFO: Starting task 7.0 in stage 60.0 (TID 313) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.187 TaskSetManager: INFO: Finished task 6.0 in stage 60.0 (TID 312) in 172 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:30.187 Executor: INFO: Running task 7.0 in stage 60.0 (TID 313) 2023-04-22 21:12:30.213 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:30.335 Executor: INFO: Finished task 7.0 in stage 60.0 (TID 313). 1197 bytes result sent to driver 2023-04-22 21:12:30.336 TaskSetManager: INFO: Finished task 7.0 in stage 60.0 (TID 313) in 150 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:30.336 TaskSchedulerImpl: INFO: Removed TaskSet 60.0, whose tasks have all completed, from pool 2023-04-22 21:12:30.336 DAGScheduler: INFO: ShuffleMapStage 60 (treeAggregate at RowMatrix.scala:94) finished in 1.479 s 2023-04-22 21:12:30.336 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:30.336 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:30.336 DAGScheduler: INFO: waiting: Set(ResultStage 61) 2023-04-22 21:12:30.336 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:30.336 DAGScheduler: INFO: Submitting ResultStage 61 (MapPartitionsRDD[134] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:30.376 MemoryStore: INFO: Block broadcast_113 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:30.382 MemoryStore: INFO: Block broadcast_113_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:30.383 BlockManagerInfo: INFO: Added broadcast_113_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:30.383 SparkContext: INFO: Created broadcast 113 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:30.383 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 61 (MapPartitionsRDD[134] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:30.383 TaskSchedulerImpl: INFO: Adding task set 61.0 with 2 tasks resource profile 0 2023-04-22 21:12:30.385 TaskSetManager: INFO: Starting task 0.0 in stage 61.0 (TID 314) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.385 Executor: INFO: Running task 0.0 in stage 61.0 (TID 314) 2023-04-22 21:12:30.411 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:30.411 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:30.416 Executor: INFO: Finished task 0.0 in stage 61.0 (TID 314). 
34646 bytes result sent to driver 2023-04-22 21:12:30.417 TaskSetManager: INFO: Starting task 1.0 in stage 61.0 (TID 315) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.417 TaskSetManager: INFO: Finished task 0.0 in stage 61.0 (TID 314) in 32 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:30.418 Executor: INFO: Running task 1.0 in stage 61.0 (TID 315) 2023-04-22 21:12:30.443 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:30.443 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:30.447 Executor: INFO: Finished task 1.0 in stage 61.0 (TID 315). 34646 bytes result sent to driver 2023-04-22 21:12:30.452 TaskSetManager: INFO: Finished task 1.0 in stage 61.0 (TID 315) in 34 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:30.452 TaskSchedulerImpl: INFO: Removed TaskSet 61.0, whose tasks have all completed, from pool 2023-04-22 21:12:30.452 DAGScheduler: INFO: ResultStage 61 (treeAggregate at RowMatrix.scala:94) finished in 0.115 s 2023-04-22 21:12:30.452 DAGScheduler: INFO: Job 31 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:30.452 TaskSchedulerImpl: INFO: Killing all running tasks in stage 61: Stage finished 2023-04-22 21:12:30.453 DAGScheduler: INFO: Job 31 finished: treeAggregate at RowMatrix.scala:94, took 1.599574 s 2023-04-22 21:12:30.457 MemoryStore: INFO: Block broadcast_114 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:30.459 MemoryStore: INFO: Block broadcast_114_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:30.459 BlockManagerInfo: INFO: Added broadcast_114_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:30.464 SparkContext: INFO: Created broadcast 114 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:30.549 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:30.558 DAGScheduler: INFO: Registering RDD 136 (treeAggregate at RowMatrix.scala:94) as input to shuffle 30 2023-04-22 21:12:30.558 DAGScheduler: INFO: Got job 32 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:30.558 DAGScheduler: INFO: Final stage: ResultStage 63 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:30.558 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 62) 2023-04-22 21:12:30.558 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 62) 2023-04-22 21:12:30.561 DAGScheduler: INFO: Submitting ShuffleMapStage 62 (MapPartitionsRDD[136] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:30.611 MemoryStore: INFO: Block broadcast_115 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:30.617 MemoryStore: INFO: Block broadcast_115_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:30.618 BlockManagerInfo: INFO: Added broadcast_115_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:30.618 SparkContext: INFO: Created broadcast 115 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:30.618 DAGScheduler: INFO: Submitting 8 missing tasks from 
ShuffleMapStage 62 (MapPartitionsRDD[136] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:30.618 TaskSchedulerImpl: INFO: Adding task set 62.0 with 8 tasks resource profile 0 2023-04-22 21:12:30.619 TaskSetManager: INFO: Starting task 0.0 in stage 62.0 (TID 316) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.620 Executor: INFO: Running task 0.0 in stage 62.0 (TID 316) 2023-04-22 21:12:30.645 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:30.768 Executor: INFO: Finished task 0.0 in stage 62.0 (TID 316). 1197 bytes result sent to driver 2023-04-22 21:12:30.769 TaskSetManager: INFO: Starting task 1.0 in stage 62.0 (TID 317) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.769 TaskSetManager: INFO: Finished task 0.0 in stage 62.0 (TID 316) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:30.785 Executor: INFO: Running task 1.0 in stage 62.0 (TID 317) 2023-04-22 21:12:30.811 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:30.938 Executor: INFO: Finished task 1.0 in stage 62.0 (TID 317). 1197 bytes result sent to driver 2023-04-22 21:12:30.938 TaskSetManager: INFO: Starting task 2.0 in stage 62.0 (TID 318) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:30.938 TaskSetManager: INFO: Finished task 1.0 in stage 62.0 (TID 317) in 169 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:30.939 Executor: INFO: Running task 2.0 in stage 62.0 (TID 318) 2023-04-22 21:12:30.964 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:31.092 Executor: INFO: Finished task 2.0 in stage 62.0 (TID 318). 1197 bytes result sent to driver 2023-04-22 21:12:31.093 TaskSetManager: INFO: Starting task 3.0 in stage 62.0 (TID 319) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.093 TaskSetManager: INFO: Finished task 2.0 in stage 62.0 (TID 318) in 155 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:31.095 Executor: INFO: Running task 3.0 in stage 62.0 (TID 319) 2023-04-22 21:12:31.121 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:31.259 Executor: INFO: Finished task 3.0 in stage 62.0 (TID 319). 1197 bytes result sent to driver 2023-04-22 21:12:31.259 TaskSetManager: INFO: Starting task 4.0 in stage 62.0 (TID 320) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.259 TaskSetManager: INFO: Finished task 3.0 in stage 62.0 (TID 319) in 166 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:31.263 Executor: INFO: Running task 4.0 in stage 62.0 (TID 320) 2023-04-22 21:12:31.288 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:31.411 Executor: INFO: Finished task 4.0 in stage 62.0 (TID 320). 
1197 bytes result sent to driver 2023-04-22 21:12:31.412 TaskSetManager: INFO: Starting task 5.0 in stage 62.0 (TID 321) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.412 TaskSetManager: INFO: Finished task 4.0 in stage 62.0 (TID 320) in 153 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:31.415 Executor: INFO: Running task 5.0 in stage 62.0 (TID 321) 2023-04-22 21:12:31.447 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:31.577 Executor: INFO: Finished task 5.0 in stage 62.0 (TID 321). 1197 bytes result sent to driver 2023-04-22 21:12:31.578 TaskSetManager: INFO: Starting task 6.0 in stage 62.0 (TID 322) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.578 TaskSetManager: INFO: Finished task 5.0 in stage 62.0 (TID 321) in 166 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:31.578 Executor: INFO: Running task 6.0 in stage 62.0 (TID 322) 2023-04-22 21:12:31.604 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:31.724 Executor: INFO: Finished task 6.0 in stage 62.0 (TID 322). 1197 bytes result sent to driver 2023-04-22 21:12:31.725 TaskSetManager: INFO: Starting task 7.0 in stage 62.0 (TID 323) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.725 TaskSetManager: INFO: Finished task 6.0 in stage 62.0 (TID 322) in 148 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:31.725 Executor: INFO: Running task 7.0 in stage 62.0 (TID 323) 2023-04-22 21:12:31.751 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:31.878 Executor: INFO: Finished task 7.0 in stage 62.0 (TID 323). 
1197 bytes result sent to driver 2023-04-22 21:12:31.879 TaskSetManager: INFO: Finished task 7.0 in stage 62.0 (TID 323) in 154 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:31.879 TaskSchedulerImpl: INFO: Removed TaskSet 62.0, whose tasks have all completed, from pool 2023-04-22 21:12:31.879 DAGScheduler: INFO: ShuffleMapStage 62 (treeAggregate at RowMatrix.scala:94) finished in 1.318 s 2023-04-22 21:12:31.879 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:31.879 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:31.879 DAGScheduler: INFO: waiting: Set(ResultStage 63) 2023-04-22 21:12:31.879 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:31.880 DAGScheduler: INFO: Submitting ResultStage 63 (MapPartitionsRDD[138] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:31.923 MemoryStore: INFO: Block broadcast_116 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:31.929 MemoryStore: INFO: Block broadcast_116_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:31.930 BlockManagerInfo: INFO: Added broadcast_116_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:31.940 SparkContext: INFO: Created broadcast 116 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:31.940 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 63 (MapPartitionsRDD[138] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:31.940 TaskSchedulerImpl: INFO: Adding task set 63.0 with 2 tasks resource profile 0 2023-04-22 21:12:31.941 TaskSetManager: INFO: Starting task 0.0 in stage 63.0 (TID 324) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.941 Executor: INFO: Running task 0.0 in stage 63.0 (TID 324) 2023-04-22 21:12:31.966 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:31.966 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:31.971 Executor: INFO: Finished task 0.0 in stage 63.0 (TID 324). 34646 bytes result sent to driver 2023-04-22 21:12:31.971 TaskSetManager: INFO: Starting task 1.0 in stage 63.0 (TID 325) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:31.972 TaskSetManager: INFO: Finished task 0.0 in stage 63.0 (TID 324) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:31.972 Executor: INFO: Running task 1.0 in stage 63.0 (TID 325) 2023-04-22 21:12:31.997 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:31.997 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:32.001 Executor: INFO: Finished task 1.0 in stage 63.0 (TID 325). 
34646 bytes result sent to driver 2023-04-22 21:12:32.005 TaskSetManager: INFO: Finished task 1.0 in stage 63.0 (TID 325) in 34 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:32.005 TaskSchedulerImpl: INFO: Removed TaskSet 63.0, whose tasks have all completed, from pool 2023-04-22 21:12:32.005 DAGScheduler: INFO: ResultStage 63 (treeAggregate at RowMatrix.scala:94) finished in 0.125 s 2023-04-22 21:12:32.005 DAGScheduler: INFO: Job 32 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:32.006 TaskSchedulerImpl: INFO: Killing all running tasks in stage 63: Stage finished 2023-04-22 21:12:32.006 DAGScheduler: INFO: Job 32 finished: treeAggregate at RowMatrix.scala:94, took 1.456865 s 2023-04-22 21:12:32.008 MemoryStore: INFO: Block broadcast_117 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:32.009 MemoryStore: INFO: Block broadcast_117_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:32.010 BlockManagerInfo: INFO: Added broadcast_117_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:32.021 SparkContext: INFO: Created broadcast 117 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:32.086 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:32.086 DAGScheduler: INFO: Registering RDD 140 (treeAggregate at RowMatrix.scala:94) as input to shuffle 31 2023-04-22 21:12:32.086 DAGScheduler: INFO: Got job 33 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:32.086 DAGScheduler: INFO: Final stage: ResultStage 65 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:32.087 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 64) 2023-04-22 21:12:32.087 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 64) 2023-04-22 21:12:32.090 DAGScheduler: INFO: Submitting ShuffleMapStage 64 (MapPartitionsRDD[140] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:32.126 MemoryStore: INFO: Block broadcast_118 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:32.132 MemoryStore: INFO: Block broadcast_118_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:32.132 BlockManagerInfo: INFO: Added broadcast_118_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:32.132 SparkContext: INFO: Created broadcast 118 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:32.133 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 64 (MapPartitionsRDD[140] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:32.133 TaskSchedulerImpl: INFO: Adding task set 64.0 with 8 tasks resource profile 0 2023-04-22 21:12:32.133 TaskSetManager: INFO: Starting task 0.0 in stage 64.0 (TID 326) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.134 Executor: INFO: Running task 0.0 in stage 64.0 (TID 326) 2023-04-22 21:12:32.160 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:32.297 Executor: INFO: Finished task 0.0 in stage 64.0 (TID 326). 
1197 bytes result sent to driver 2023-04-22 21:12:32.297 TaskSetManager: INFO: Starting task 1.0 in stage 64.0 (TID 327) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.297 TaskSetManager: INFO: Finished task 0.0 in stage 64.0 (TID 326) in 164 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:32.298 Executor: INFO: Running task 1.0 in stage 64.0 (TID 327) 2023-04-22 21:12:32.323 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:32.450 Executor: INFO: Finished task 1.0 in stage 64.0 (TID 327). 1197 bytes result sent to driver 2023-04-22 21:12:32.450 TaskSetManager: INFO: Starting task 2.0 in stage 64.0 (TID 328) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.451 TaskSetManager: INFO: Finished task 1.0 in stage 64.0 (TID 327) in 154 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:32.451 Executor: INFO: Running task 2.0 in stage 64.0 (TID 328) 2023-04-22 21:12:32.476 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:32.598 Executor: INFO: Finished task 2.0 in stage 64.0 (TID 328). 1197 bytes result sent to driver 2023-04-22 21:12:32.599 TaskSetManager: INFO: Starting task 3.0 in stage 64.0 (TID 329) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.599 TaskSetManager: INFO: Finished task 2.0 in stage 64.0 (TID 328) in 149 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:32.600 Executor: INFO: Running task 3.0 in stage 64.0 (TID 329) 2023-04-22 21:12:32.626 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:32.749 Executor: INFO: Finished task 3.0 in stage 64.0 (TID 329). 1197 bytes result sent to driver 2023-04-22 21:12:32.750 TaskSetManager: INFO: Starting task 4.0 in stage 64.0 (TID 330) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.750 TaskSetManager: INFO: Finished task 3.0 in stage 64.0 (TID 329) in 152 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:32.750 Executor: INFO: Running task 4.0 in stage 64.0 (TID 330) 2023-04-22 21:12:32.776 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:32.916 Executor: INFO: Finished task 4.0 in stage 64.0 (TID 330). 1197 bytes result sent to driver 2023-04-22 21:12:32.916 TaskSetManager: INFO: Starting task 5.0 in stage 64.0 (TID 331) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:32.917 TaskSetManager: INFO: Finished task 4.0 in stage 64.0 (TID 330) in 167 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:32.917 Executor: INFO: Running task 5.0 in stage 64.0 (TID 331) 2023-04-22 21:12:32.942 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:33.067 Executor: INFO: Finished task 5.0 in stage 64.0 (TID 331). 
1197 bytes result sent to driver 2023-04-22 21:12:33.067 TaskSetManager: INFO: Starting task 6.0 in stage 64.0 (TID 332) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.068 TaskSetManager: INFO: Finished task 5.0 in stage 64.0 (TID 331) in 151 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:33.068 Executor: INFO: Running task 6.0 in stage 64.0 (TID 332) 2023-04-22 21:12:33.093 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:33.228 Executor: INFO: Finished task 6.0 in stage 64.0 (TID 332). 1197 bytes result sent to driver 2023-04-22 21:12:33.229 TaskSetManager: INFO: Starting task 7.0 in stage 64.0 (TID 333) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.229 TaskSetManager: INFO: Finished task 6.0 in stage 64.0 (TID 332) in 162 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:33.230 Executor: INFO: Running task 7.0 in stage 64.0 (TID 333) 2023-04-22 21:12:33.256 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:33.386 Executor: INFO: Finished task 7.0 in stage 64.0 (TID 333). 1197 bytes result sent to driver 2023-04-22 21:12:33.390 TaskSetManager: INFO: Finished task 7.0 in stage 64.0 (TID 333) in 161 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:33.390 TaskSchedulerImpl: INFO: Removed TaskSet 64.0, whose tasks have all completed, from pool 2023-04-22 21:12:33.390 DAGScheduler: INFO: ShuffleMapStage 64 (treeAggregate at RowMatrix.scala:94) finished in 1.299 s 2023-04-22 21:12:33.390 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:33.390 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:33.390 DAGScheduler: INFO: waiting: Set(ResultStage 65) 2023-04-22 21:12:33.390 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:33.390 DAGScheduler: INFO: Submitting ResultStage 65 (MapPartitionsRDD[142] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:33.431 MemoryStore: INFO: Block broadcast_119 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:33.437 MemoryStore: INFO: Block broadcast_119_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:33.437 BlockManagerInfo: INFO: Added broadcast_119_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:33.438 SparkContext: INFO: Created broadcast 119 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:33.438 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 65 (MapPartitionsRDD[142] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:33.438 TaskSchedulerImpl: INFO: Adding task set 65.0 with 2 tasks resource profile 0 2023-04-22 21:12:33.439 TaskSetManager: INFO: Starting task 0.0 in stage 65.0 (TID 334) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.439 Executor: INFO: Running task 0.0 in stage 65.0 (TID 334) 2023-04-22 21:12:33.464 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:33.464 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:33.470 Executor: INFO: Finished task 0.0 in stage 65.0 (TID 334). 34646 bytes result sent to driver 2023-04-22 21:12:33.472 TaskSetManager: INFO: Starting task 1.0 in stage 65.0 (TID 335) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.472 TaskSetManager: INFO: Finished task 0.0 in stage 65.0 (TID 334) in 33 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:33.473 Executor: INFO: Running task 1.0 in stage 65.0 (TID 335) 2023-04-22 21:12:33.498 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:33.498 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:33.502 Executor: INFO: Finished task 1.0 in stage 65.0 (TID 335). 34646 bytes result sent to driver 2023-04-22 21:12:33.505 TaskSetManager: INFO: Finished task 1.0 in stage 65.0 (TID 335) in 33 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:33.505 TaskSchedulerImpl: INFO: Removed TaskSet 65.0, whose tasks have all completed, from pool 2023-04-22 21:12:33.505 DAGScheduler: INFO: ResultStage 65 (treeAggregate at RowMatrix.scala:94) finished in 0.114 s 2023-04-22 21:12:33.506 DAGScheduler: INFO: Job 33 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:33.506 TaskSchedulerImpl: INFO: Killing all running tasks in stage 65: Stage finished 2023-04-22 21:12:33.507 DAGScheduler: INFO: Job 33 finished: treeAggregate at RowMatrix.scala:94, took 1.421419 s 2023-04-22 21:12:33.509 MemoryStore: INFO: Block broadcast_120 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:33.510 MemoryStore: INFO: Block broadcast_120_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:33.510 BlockManagerInfo: INFO: Added broadcast_120_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:33.511 SparkContext: INFO: Created broadcast 120 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:33.573 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:33.573 DAGScheduler: INFO: Registering RDD 144 (treeAggregate at RowMatrix.scala:94) as input to shuffle 32 2023-04-22 21:12:33.574 DAGScheduler: INFO: Got job 34 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:33.574 DAGScheduler: INFO: Final stage: ResultStage 67 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:33.574 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 66) 2023-04-22 21:12:33.574 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 66) 2023-04-22 21:12:33.577 DAGScheduler: INFO: Submitting ShuffleMapStage 66 (MapPartitionsRDD[144] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:33.612 MemoryStore: INFO: Block broadcast_121 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:33.618 MemoryStore: INFO: Block broadcast_121_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:33.619 BlockManagerInfo: INFO: Added broadcast_121_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:33.620 SparkContext: INFO: Created broadcast 121 from broadcast at DAGScheduler.scala:1513 
2023-04-22 21:12:33.620 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 66 (MapPartitionsRDD[144] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:33.620 TaskSchedulerImpl: INFO: Adding task set 66.0 with 8 tasks resource profile 0 2023-04-22 21:12:33.621 TaskSetManager: INFO: Starting task 0.0 in stage 66.0 (TID 336) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.621 Executor: INFO: Running task 0.0 in stage 66.0 (TID 336) 2023-04-22 21:12:33.647 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:33.770 Executor: INFO: Finished task 0.0 in stage 66.0 (TID 336). 1197 bytes result sent to driver 2023-04-22 21:12:33.770 TaskSetManager: INFO: Starting task 1.0 in stage 66.0 (TID 337) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.771 TaskSetManager: INFO: Finished task 0.0 in stage 66.0 (TID 336) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:33.771 Executor: INFO: Running task 1.0 in stage 66.0 (TID 337) 2023-04-22 21:12:33.797 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:33.923 Executor: INFO: Finished task 1.0 in stage 66.0 (TID 337). 1197 bytes result sent to driver 2023-04-22 21:12:33.923 TaskSetManager: INFO: Starting task 2.0 in stage 66.0 (TID 338) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:33.924 TaskSetManager: INFO: Finished task 1.0 in stage 66.0 (TID 337) in 154 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:33.924 Executor: INFO: Running task 2.0 in stage 66.0 (TID 338) 2023-04-22 21:12:33.949 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:34.077 Executor: INFO: Finished task 2.0 in stage 66.0 (TID 338). 1197 bytes result sent to driver 2023-04-22 21:12:34.077 TaskSetManager: INFO: Starting task 3.0 in stage 66.0 (TID 339) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.078 TaskSetManager: INFO: Finished task 2.0 in stage 66.0 (TID 338) in 155 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:34.078 Executor: INFO: Running task 3.0 in stage 66.0 (TID 339) 2023-04-22 21:12:34.103 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:34.247 Executor: INFO: Finished task 3.0 in stage 66.0 (TID 339). 1197 bytes result sent to driver 2023-04-22 21:12:34.247 TaskSetManager: INFO: Starting task 4.0 in stage 66.0 (TID 340) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.247 TaskSetManager: INFO: Finished task 3.0 in stage 66.0 (TID 339) in 170 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:34.248 Executor: INFO: Running task 4.0 in stage 66.0 (TID 340) 2023-04-22 21:12:34.272 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:34.401 Executor: INFO: Finished task 4.0 in stage 66.0 (TID 340). 
1197 bytes result sent to driver 2023-04-22 21:12:34.401 TaskSetManager: INFO: Starting task 5.0 in stage 66.0 (TID 341) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.402 TaskSetManager: INFO: Finished task 4.0 in stage 66.0 (TID 340) in 155 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:34.404 Executor: INFO: Running task 5.0 in stage 66.0 (TID 341) 2023-04-22 21:12:34.429 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:34.554 Executor: INFO: Finished task 5.0 in stage 66.0 (TID 341). 1197 bytes result sent to driver 2023-04-22 21:12:34.558 TaskSetManager: INFO: Starting task 6.0 in stage 66.0 (TID 342) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.558 TaskSetManager: INFO: Finished task 5.0 in stage 66.0 (TID 341) in 157 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:34.558 Executor: INFO: Running task 6.0 in stage 66.0 (TID 342) 2023-04-22 21:12:34.583 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:34.710 Executor: INFO: Finished task 6.0 in stage 66.0 (TID 342). 1197 bytes result sent to driver 2023-04-22 21:12:34.711 TaskSetManager: INFO: Starting task 7.0 in stage 66.0 (TID 343) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.711 TaskSetManager: INFO: Finished task 6.0 in stage 66.0 (TID 342) in 154 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:34.711 Executor: INFO: Running task 7.0 in stage 66.0 (TID 343) 2023-04-22 21:12:34.736 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:34.865 Executor: INFO: Finished task 7.0 in stage 66.0 (TID 343). 
1197 bytes result sent to driver 2023-04-22 21:12:34.865 TaskSetManager: INFO: Finished task 7.0 in stage 66.0 (TID 343) in 155 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:34.866 TaskSchedulerImpl: INFO: Removed TaskSet 66.0, whose tasks have all completed, from pool 2023-04-22 21:12:34.866 DAGScheduler: INFO: ShuffleMapStage 66 (treeAggregate at RowMatrix.scala:94) finished in 1.289 s 2023-04-22 21:12:34.866 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:34.866 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:34.866 DAGScheduler: INFO: waiting: Set(ResultStage 67) 2023-04-22 21:12:34.866 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:34.866 DAGScheduler: INFO: Submitting ResultStage 67 (MapPartitionsRDD[146] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:34.906 MemoryStore: INFO: Block broadcast_122 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:34.925 MemoryStore: INFO: Block broadcast_122_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:34.926 BlockManagerInfo: INFO: Added broadcast_122_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:34.926 SparkContext: INFO: Created broadcast 122 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:34.926 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 67 (MapPartitionsRDD[146] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:34.926 TaskSchedulerImpl: INFO: Adding task set 67.0 with 2 tasks resource profile 0 2023-04-22 21:12:34.927 TaskSetManager: INFO: Starting task 0.0 in stage 67.0 (TID 344) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.927 Executor: INFO: Running task 0.0 in stage 67.0 (TID 344) 2023-04-22 21:12:34.975 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:34.975 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:34.979 Executor: INFO: Finished task 0.0 in stage 67.0 (TID 344). 34646 bytes result sent to driver 2023-04-22 21:12:34.980 TaskSetManager: INFO: Starting task 1.0 in stage 67.0 (TID 345) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:34.980 TaskSetManager: INFO: Finished task 0.0 in stage 67.0 (TID 344) in 53 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:34.982 Executor: INFO: Running task 1.0 in stage 67.0 (TID 345) 2023-04-22 21:12:35.026 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:35.026 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:35.030 Executor: INFO: Finished task 1.0 in stage 67.0 (TID 345). 
34646 bytes result sent to driver 2023-04-22 21:12:35.030 TaskSetManager: INFO: Finished task 1.0 in stage 67.0 (TID 345) in 50 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:35.030 TaskSchedulerImpl: INFO: Removed TaskSet 67.0, whose tasks have all completed, from pool 2023-04-22 21:12:35.031 DAGScheduler: INFO: ResultStage 67 (treeAggregate at RowMatrix.scala:94) finished in 0.164 s 2023-04-22 21:12:35.031 DAGScheduler: INFO: Job 34 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:35.031 TaskSchedulerImpl: INFO: Killing all running tasks in stage 67: Stage finished 2023-04-22 21:12:35.031 DAGScheduler: INFO: Job 34 finished: treeAggregate at RowMatrix.scala:94, took 1.458263 s 2023-04-22 21:12:35.033 MemoryStore: INFO: Block broadcast_123 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:35.034 MemoryStore: INFO: Block broadcast_123_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:35.040 BlockManagerInfo: INFO: Added broadcast_123_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:35.040 SparkContext: INFO: Created broadcast 123 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:35.128 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:35.128 DAGScheduler: INFO: Registering RDD 148 (treeAggregate at RowMatrix.scala:94) as input to shuffle 33 2023-04-22 21:12:35.128 DAGScheduler: INFO: Got job 35 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:35.128 DAGScheduler: INFO: Final stage: ResultStage 69 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:35.128 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 68) 2023-04-22 21:12:35.129 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 68) 2023-04-22 21:12:35.130 DAGScheduler: INFO: Submitting ShuffleMapStage 68 (MapPartitionsRDD[148] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:35.164 MemoryStore: INFO: Block broadcast_124 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:35.174 MemoryStore: INFO: Block broadcast_124_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:35.174 BlockManagerInfo: INFO: Added broadcast_124_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:35.174 SparkContext: INFO: Created broadcast 124 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:35.174 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 68 (MapPartitionsRDD[148] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:35.174 TaskSchedulerImpl: INFO: Adding task set 68.0 with 8 tasks resource profile 0 2023-04-22 21:12:35.175 TaskSetManager: INFO: Starting task 0.0 in stage 68.0 (TID 346) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.176 Executor: INFO: Running task 0.0 in stage 68.0 (TID 346) 2023-04-22 21:12:35.201 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:35.338 Executor: INFO: Finished task 0.0 in stage 68.0 (TID 346). 
1197 bytes result sent to driver 2023-04-22 21:12:35.339 TaskSetManager: INFO: Starting task 1.0 in stage 68.0 (TID 347) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.339 TaskSetManager: INFO: Finished task 0.0 in stage 68.0 (TID 346) in 164 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:35.339 Executor: INFO: Running task 1.0 in stage 68.0 (TID 347) 2023-04-22 21:12:35.364 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:35.492 Executor: INFO: Finished task 1.0 in stage 68.0 (TID 347). 1197 bytes result sent to driver 2023-04-22 21:12:35.492 TaskSetManager: INFO: Starting task 2.0 in stage 68.0 (TID 348) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.492 TaskSetManager: INFO: Finished task 1.0 in stage 68.0 (TID 347) in 154 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:35.493 Executor: INFO: Running task 2.0 in stage 68.0 (TID 348) 2023-04-22 21:12:35.518 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:35.646 Executor: INFO: Finished task 2.0 in stage 68.0 (TID 348). 1197 bytes result sent to driver 2023-04-22 21:12:35.646 TaskSetManager: INFO: Starting task 3.0 in stage 68.0 (TID 349) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.646 TaskSetManager: INFO: Finished task 2.0 in stage 68.0 (TID 348) in 154 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:35.662 Executor: INFO: Running task 3.0 in stage 68.0 (TID 349) 2023-04-22 21:12:35.688 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:35.816 Executor: INFO: Finished task 3.0 in stage 68.0 (TID 349). 1197 bytes result sent to driver 2023-04-22 21:12:35.816 TaskSetManager: INFO: Starting task 4.0 in stage 68.0 (TID 350) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.816 TaskSetManager: INFO: Finished task 3.0 in stage 68.0 (TID 349) in 170 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:35.818 Executor: INFO: Running task 4.0 in stage 68.0 (TID 350) 2023-04-22 21:12:35.843 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:35.972 Executor: INFO: Finished task 4.0 in stage 68.0 (TID 350). 1197 bytes result sent to driver 2023-04-22 21:12:35.972 TaskSetManager: INFO: Starting task 5.0 in stage 68.0 (TID 351) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:35.972 TaskSetManager: INFO: Finished task 4.0 in stage 68.0 (TID 350) in 156 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:35.973 Executor: INFO: Running task 5.0 in stage 68.0 (TID 351) 2023-04-22 21:12:35.998 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:36.122 Executor: INFO: Finished task 5.0 in stage 68.0 (TID 351). 
1197 bytes result sent to driver 2023-04-22 21:12:36.122 TaskSetManager: INFO: Starting task 6.0 in stage 68.0 (TID 352) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.122 TaskSetManager: INFO: Finished task 5.0 in stage 68.0 (TID 351) in 150 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:36.122 Executor: INFO: Running task 6.0 in stage 68.0 (TID 352) 2023-04-22 21:12:36.147 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:36.288 Executor: INFO: Finished task 6.0 in stage 68.0 (TID 352). 1197 bytes result sent to driver 2023-04-22 21:12:36.289 TaskSetManager: INFO: Starting task 7.0 in stage 68.0 (TID 353) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.289 TaskSetManager: INFO: Finished task 6.0 in stage 68.0 (TID 352) in 167 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:36.289 Executor: INFO: Running task 7.0 in stage 68.0 (TID 353) 2023-04-22 21:12:36.314 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:36.435 Executor: INFO: Finished task 7.0 in stage 68.0 (TID 353). 1197 bytes result sent to driver 2023-04-22 21:12:36.437 TaskSetManager: INFO: Finished task 7.0 in stage 68.0 (TID 353) in 149 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:36.437 TaskSchedulerImpl: INFO: Removed TaskSet 68.0, whose tasks have all completed, from pool 2023-04-22 21:12:36.437 DAGScheduler: INFO: ShuffleMapStage 68 (treeAggregate at RowMatrix.scala:94) finished in 1.307 s 2023-04-22 21:12:36.437 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:36.437 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:36.437 DAGScheduler: INFO: waiting: Set(ResultStage 69) 2023-04-22 21:12:36.437 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:36.438 DAGScheduler: INFO: Submitting ResultStage 69 (MapPartitionsRDD[150] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:36.477 MemoryStore: INFO: Block broadcast_125 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:36.484 MemoryStore: INFO: Block broadcast_125_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:36.495 BlockManagerInfo: INFO: Added broadcast_125_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:36.499 SparkContext: INFO: Created broadcast 125 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:36.500 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 69 (MapPartitionsRDD[150] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:36.500 TaskSchedulerImpl: INFO: Adding task set 69.0 with 2 tasks resource profile 0 2023-04-22 21:12:36.501 TaskSetManager: INFO: Starting task 0.0 in stage 69.0 (TID 354) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.501 Executor: INFO: Running task 0.0 in stage 69.0 (TID 354) 2023-04-22 21:12:36.527 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:36.527 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:36.531 Executor: INFO: Finished task 0.0 in stage 69.0 (TID 354). 34646 bytes result sent to driver 2023-04-22 21:12:36.532 TaskSetManager: INFO: Starting task 1.0 in stage 69.0 (TID 355) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.532 TaskSetManager: INFO: Finished task 0.0 in stage 69.0 (TID 354) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:36.532 Executor: INFO: Running task 1.0 in stage 69.0 (TID 355) 2023-04-22 21:12:36.557 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:36.557 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:36.561 Executor: INFO: Finished task 1.0 in stage 69.0 (TID 355). 34646 bytes result sent to driver 2023-04-22 21:12:36.574 TaskSetManager: INFO: Finished task 1.0 in stage 69.0 (TID 355) in 43 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:36.574 TaskSchedulerImpl: INFO: Removed TaskSet 69.0, whose tasks have all completed, from pool 2023-04-22 21:12:36.574 DAGScheduler: INFO: ResultStage 69 (treeAggregate at RowMatrix.scala:94) finished in 0.136 s 2023-04-22 21:12:36.574 DAGScheduler: INFO: Job 35 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:36.574 TaskSchedulerImpl: INFO: Killing all running tasks in stage 69: Stage finished 2023-04-22 21:12:36.575 DAGScheduler: INFO: Job 35 finished: treeAggregate at RowMatrix.scala:94, took 1.447162 s 2023-04-22 21:12:36.576 MemoryStore: INFO: Block broadcast_126 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:36.578 MemoryStore: INFO: Block broadcast_126_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:36.578 BlockManagerInfo: INFO: Added broadcast_126_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:36.580 SparkContext: INFO: Created broadcast 126 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:36.696 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:36.696 DAGScheduler: INFO: Registering RDD 152 (treeAggregate at RowMatrix.scala:94) as input to shuffle 34 2023-04-22 21:12:36.696 DAGScheduler: INFO: Got job 36 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:36.696 DAGScheduler: INFO: Final stage: ResultStage 71 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:36.696 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 70) 2023-04-22 21:12:36.697 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 70) 2023-04-22 21:12:36.698 DAGScheduler: INFO: Submitting ShuffleMapStage 70 (MapPartitionsRDD[152] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:36.749 MemoryStore: INFO: Block broadcast_127 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:36.758 MemoryStore: INFO: Block broadcast_127_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:36.758 BlockManagerInfo: INFO: Added broadcast_127_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:36.759 SparkContext: INFO: Created broadcast 127 from broadcast at DAGScheduler.scala:1513 
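Note on the pattern above: each of these jobs (Job 34, Job 35, ...) broadcasts a ~97.4 KiB vector from RowMatrix.scala:93, runs an 8-task ShuffleMapStage over the cached partitions rdd_12_0..rdd_12_7, finishes with a 2-task ResultStage, and completes in roughly 1.4 s. A broadcast at RowMatrix.scala:93 followed by a treeAggregate at RowMatrix.scala:94 is consistent with Spark MLlib's distributed Gramian-vector multiply, which the iterative eigensolver behind RowMatrix.computeSVD issues once per iteration; that would explain why the same small job recurs many times. The log does not show the user-level Hail call that launched this computation, so the following PySpark sketch is an assumption rather than the actual workload: it only reproduces the same repeating job signature (cached 8-partition input, one small treeAggregate job per solver iteration), and every name and size in it is illustrative.

```python
# Hypothetical PySpark sketch -- NOT the user code from this session, which the log
# does not show. It reproduces the repeating job signature seen above: a cached
# 8-partition row matrix and one small treeAggregate job per eigensolver iteration.
from pyspark.sql import SparkSession
from pyspark.mllib.linalg.distributed import RowMatrix
from pyspark.mllib.random import RandomRDDs

spark = SparkSession.builder.master("local[*]").appName("rowmatrix-svd-sketch").getOrCreate()
sc = spark.sparkContext

# 8 partitions mirror the 8 ShuffleMapStage tasks over rdd_12_0 .. rdd_12_7;
# caching the input is why every task logs "Found block rdd_12_N locally".
rows = RandomRDDs.normalVectorRDD(sc, numRows=1000, numCols=16000, numPartitions=8, seed=42)
rows.cache()

# With a wide enough matrix (roughly more than 15,000 columns), computeSVD's "auto"
# mode uses a distributed iterative eigensolver; each iteration multiplies the
# Gramian by a broadcast vector -- i.e. one broadcast at RowMatrix.scala:93 and one
# "treeAggregate at RowMatrix.scala:94" job, repeated until convergence.
svd = RowMatrix(rows).computeSVD(k=10, computeU=False)
print(svd.s)

spark.stop()
```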
2023-04-22 21:12:36.759 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 70 (MapPartitionsRDD[152] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:36.759 TaskSchedulerImpl: INFO: Adding task set 70.0 with 8 tasks resource profile 0 2023-04-22 21:12:36.772 TaskSetManager: INFO: Starting task 0.0 in stage 70.0 (TID 356) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.772 Executor: INFO: Running task 0.0 in stage 70.0 (TID 356) 2023-04-22 21:12:36.820 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:36.957 Executor: INFO: Finished task 0.0 in stage 70.0 (TID 356). 1197 bytes result sent to driver 2023-04-22 21:12:36.959 TaskSetManager: INFO: Starting task 1.0 in stage 70.0 (TID 357) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:36.959 TaskSetManager: INFO: Finished task 0.0 in stage 70.0 (TID 356) in 187 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:36.960 Executor: INFO: Running task 1.0 in stage 70.0 (TID 357) 2023-04-22 21:12:36.986 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:37.111 Executor: INFO: Finished task 1.0 in stage 70.0 (TID 357). 1197 bytes result sent to driver 2023-04-22 21:12:37.111 TaskSetManager: INFO: Starting task 2.0 in stage 70.0 (TID 358) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.111 TaskSetManager: INFO: Finished task 1.0 in stage 70.0 (TID 357) in 152 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:37.112 Executor: INFO: Running task 2.0 in stage 70.0 (TID 358) 2023-04-22 21:12:37.137 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:37.259 Executor: INFO: Finished task 2.0 in stage 70.0 (TID 358). 1197 bytes result sent to driver 2023-04-22 21:12:37.260 TaskSetManager: INFO: Starting task 3.0 in stage 70.0 (TID 359) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.260 TaskSetManager: INFO: Finished task 2.0 in stage 70.0 (TID 358) in 149 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:37.260 Executor: INFO: Running task 3.0 in stage 70.0 (TID 359) 2023-04-22 21:12:37.296 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:37.418 Executor: INFO: Finished task 3.0 in stage 70.0 (TID 359). 1197 bytes result sent to driver 2023-04-22 21:12:37.418 TaskSetManager: INFO: Starting task 4.0 in stage 70.0 (TID 360) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.419 TaskSetManager: INFO: Finished task 3.0 in stage 70.0 (TID 359) in 160 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:37.419 Executor: INFO: Running task 4.0 in stage 70.0 (TID 360) 2023-04-22 21:12:37.444 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:37.573 Executor: INFO: Finished task 4.0 in stage 70.0 (TID 360). 
1197 bytes result sent to driver 2023-04-22 21:12:37.573 TaskSetManager: INFO: Starting task 5.0 in stage 70.0 (TID 361) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.573 TaskSetManager: INFO: Finished task 4.0 in stage 70.0 (TID 360) in 155 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:37.574 Executor: INFO: Running task 5.0 in stage 70.0 (TID 361) 2023-04-22 21:12:37.599 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:37.724 Executor: INFO: Finished task 5.0 in stage 70.0 (TID 361). 1197 bytes result sent to driver 2023-04-22 21:12:37.725 TaskSetManager: INFO: Starting task 6.0 in stage 70.0 (TID 362) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.725 TaskSetManager: INFO: Finished task 5.0 in stage 70.0 (TID 361) in 152 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:37.725 Executor: INFO: Running task 6.0 in stage 70.0 (TID 362) 2023-04-22 21:12:37.750 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:37.872 Executor: INFO: Finished task 6.0 in stage 70.0 (TID 362). 1197 bytes result sent to driver 2023-04-22 21:12:37.872 TaskSetManager: INFO: Starting task 7.0 in stage 70.0 (TID 363) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:37.873 TaskSetManager: INFO: Finished task 6.0 in stage 70.0 (TID 362) in 148 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:37.873 Executor: INFO: Running task 7.0 in stage 70.0 (TID 363) 2023-04-22 21:12:37.898 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:38.025 Executor: INFO: Finished task 7.0 in stage 70.0 (TID 363). 
1197 bytes result sent to driver 2023-04-22 21:12:38.026 TaskSetManager: INFO: Finished task 7.0 in stage 70.0 (TID 363) in 154 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:38.026 TaskSchedulerImpl: INFO: Removed TaskSet 70.0, whose tasks have all completed, from pool 2023-04-22 21:12:38.026 DAGScheduler: INFO: ShuffleMapStage 70 (treeAggregate at RowMatrix.scala:94) finished in 1.327 s 2023-04-22 21:12:38.026 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:38.026 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:38.026 DAGScheduler: INFO: waiting: Set(ResultStage 71) 2023-04-22 21:12:38.026 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:38.026 DAGScheduler: INFO: Submitting ResultStage 71 (MapPartitionsRDD[154] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:38.067 MemoryStore: INFO: Block broadcast_128 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:38.073 MemoryStore: INFO: Block broadcast_128_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:38.073 BlockManagerInfo: INFO: Added broadcast_128_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.074 SparkContext: INFO: Created broadcast 128 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:38.074 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 71 (MapPartitionsRDD[154] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:38.074 TaskSchedulerImpl: INFO: Adding task set 71.0 with 2 tasks resource profile 0 2023-04-22 21:12:38.075 TaskSetManager: INFO: Starting task 0.0 in stage 71.0 (TID 364) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.076 Executor: INFO: Running task 0.0 in stage 71.0 (TID 364) 2023-04-22 21:12:38.101 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:38.101 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:38.105 Executor: INFO: Finished task 0.0 in stage 71.0 (TID 364). 34646 bytes result sent to driver 2023-04-22 21:12:38.106 TaskSetManager: INFO: Starting task 1.0 in stage 71.0 (TID 365) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.107 TaskSetManager: INFO: Finished task 0.0 in stage 71.0 (TID 364) in 33 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:38.110 Executor: INFO: Running task 1.0 in stage 71.0 (TID 365) 2023-04-22 21:12:38.135 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:38.135 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:38.148 Executor: INFO: Finished task 1.0 in stage 71.0 (TID 365). 
34646 bytes result sent to driver 2023-04-22 21:12:38.149 TaskSetManager: INFO: Finished task 1.0 in stage 71.0 (TID 365) in 43 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:38.149 TaskSchedulerImpl: INFO: Removed TaskSet 71.0, whose tasks have all completed, from pool 2023-04-22 21:12:38.149 DAGScheduler: INFO: ResultStage 71 (treeAggregate at RowMatrix.scala:94) finished in 0.122 s 2023-04-22 21:12:38.149 DAGScheduler: INFO: Job 36 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:38.149 TaskSchedulerImpl: INFO: Killing all running tasks in stage 71: Stage finished 2023-04-22 21:12:38.150 DAGScheduler: INFO: Job 36 finished: treeAggregate at RowMatrix.scala:94, took 1.454004 s 2023-04-22 21:12:38.154 MemoryStore: INFO: Block broadcast_129 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:38.155 MemoryStore: INFO: Block broadcast_129_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:38.156 BlockManagerInfo: INFO: Added broadcast_129_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.156 SparkContext: INFO: Created broadcast 129 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:38.296 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:38.306 DAGScheduler: INFO: Registering RDD 156 (treeAggregate at RowMatrix.scala:94) as input to shuffle 35 2023-04-22 21:12:38.306 DAGScheduler: INFO: Got job 37 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:38.306 DAGScheduler: INFO: Final stage: ResultStage 73 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:38.306 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 72) 2023-04-22 21:12:38.306 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 72) 2023-04-22 21:12:38.319 BlockManagerInfo: INFO: Removed broadcast_111_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.320 BlockManagerInfo: INFO: Removed broadcast_126_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.320 DAGScheduler: INFO: Submitting ShuffleMapStage 72 (MapPartitionsRDD[156] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:38.331 BlockManagerInfo: INFO: Removed broadcast_115_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.332 BlockManagerInfo: INFO: Removed broadcast_122_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.333 BlockManagerInfo: INFO: Removed broadcast_119_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.336 BlockManagerInfo: INFO: Removed broadcast_114_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.337 BlockManagerInfo: INFO: Removed broadcast_112_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.339 BlockManagerInfo: INFO: Removed broadcast_116_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.340 BlockManagerInfo: INFO: Removed broadcast_123_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.340 BlockManagerInfo: INFO: 
Removed broadcast_128_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.341 BlockManagerInfo: INFO: Removed broadcast_120_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.346 BlockManagerInfo: INFO: Removed broadcast_127_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.347 BlockManagerInfo: INFO: Removed broadcast_124_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.347 BlockManagerInfo: INFO: Removed broadcast_125_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.349 BlockManagerInfo: INFO: Removed broadcast_121_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.350 BlockManagerInfo: INFO: Removed broadcast_118_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.352 BlockManagerInfo: INFO: Removed broadcast_117_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.353 BlockManagerInfo: INFO: Removed broadcast_113_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.387 MemoryStore: INFO: Block broadcast_130 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:38.393 MemoryStore: INFO: Block broadcast_130_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:38.396 BlockManagerInfo: INFO: Added broadcast_130_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:38.397 SparkContext: INFO: Created broadcast 130 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:38.397 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 72 (MapPartitionsRDD[156] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:38.397 TaskSchedulerImpl: INFO: Adding task set 72.0 with 8 tasks resource profile 0 2023-04-22 21:12:38.398 TaskSetManager: INFO: Starting task 0.0 in stage 72.0 (TID 366) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.398 Executor: INFO: Running task 0.0 in stage 72.0 (TID 366) 2023-04-22 21:12:38.424 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:38.547 Executor: INFO: Finished task 0.0 in stage 72.0 (TID 366). 1197 bytes result sent to driver 2023-04-22 21:12:38.548 TaskSetManager: INFO: Starting task 1.0 in stage 72.0 (TID 367) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.548 TaskSetManager: INFO: Finished task 0.0 in stage 72.0 (TID 366) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:38.548 Executor: INFO: Running task 1.0 in stage 72.0 (TID 367) 2023-04-22 21:12:38.574 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:38.699 Executor: INFO: Finished task 1.0 in stage 72.0 (TID 367). 
1197 bytes result sent to driver 2023-04-22 21:12:38.699 TaskSetManager: INFO: Starting task 2.0 in stage 72.0 (TID 368) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.699 TaskSetManager: INFO: Finished task 1.0 in stage 72.0 (TID 367) in 152 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:38.700 Executor: INFO: Running task 2.0 in stage 72.0 (TID 368) 2023-04-22 21:12:38.725 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:38.853 Executor: INFO: Finished task 2.0 in stage 72.0 (TID 368). 1197 bytes result sent to driver 2023-04-22 21:12:38.853 TaskSetManager: INFO: Starting task 3.0 in stage 72.0 (TID 369) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:38.853 TaskSetManager: INFO: Finished task 2.0 in stage 72.0 (TID 368) in 154 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:38.854 Executor: INFO: Running task 3.0 in stage 72.0 (TID 369) 2023-04-22 21:12:38.879 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:39.001 Executor: INFO: Finished task 3.0 in stage 72.0 (TID 369). 1197 bytes result sent to driver 2023-04-22 21:12:39.001 TaskSetManager: INFO: Starting task 4.0 in stage 72.0 (TID 370) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.002 TaskSetManager: INFO: Finished task 3.0 in stage 72.0 (TID 369) in 149 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:39.002 Executor: INFO: Running task 4.0 in stage 72.0 (TID 370) 2023-04-22 21:12:39.027 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:39.151 Executor: INFO: Finished task 4.0 in stage 72.0 (TID 370). 1197 bytes result sent to driver 2023-04-22 21:12:39.151 TaskSetManager: INFO: Starting task 5.0 in stage 72.0 (TID 371) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.151 TaskSetManager: INFO: Finished task 4.0 in stage 72.0 (TID 370) in 150 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:39.152 Executor: INFO: Running task 5.0 in stage 72.0 (TID 371) 2023-04-22 21:12:39.177 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:39.314 Executor: INFO: Finished task 5.0 in stage 72.0 (TID 371). 1197 bytes result sent to driver 2023-04-22 21:12:39.315 TaskSetManager: INFO: Starting task 6.0 in stage 72.0 (TID 372) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.315 TaskSetManager: INFO: Finished task 5.0 in stage 72.0 (TID 371) in 164 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:39.315 Executor: INFO: Running task 6.0 in stage 72.0 (TID 372) 2023-04-22 21:12:39.341 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:39.464 Executor: INFO: Finished task 6.0 in stage 72.0 (TID 372). 
1197 bytes result sent to driver 2023-04-22 21:12:39.464 TaskSetManager: INFO: Starting task 7.0 in stage 72.0 (TID 373) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.464 TaskSetManager: INFO: Finished task 6.0 in stage 72.0 (TID 372) in 149 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:39.465 Executor: INFO: Running task 7.0 in stage 72.0 (TID 373) 2023-04-22 21:12:39.490 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:39.617 Executor: INFO: Finished task 7.0 in stage 72.0 (TID 373). 1197 bytes result sent to driver 2023-04-22 21:12:39.617 TaskSetManager: INFO: Finished task 7.0 in stage 72.0 (TID 373) in 153 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:39.617 TaskSchedulerImpl: INFO: Removed TaskSet 72.0, whose tasks have all completed, from pool 2023-04-22 21:12:39.618 DAGScheduler: INFO: ShuffleMapStage 72 (treeAggregate at RowMatrix.scala:94) finished in 1.297 s 2023-04-22 21:12:39.618 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:39.618 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:39.618 DAGScheduler: INFO: waiting: Set(ResultStage 73) 2023-04-22 21:12:39.618 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:39.618 DAGScheduler: INFO: Submitting ResultStage 73 (MapPartitionsRDD[158] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:39.658 MemoryStore: INFO: Block broadcast_131 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:39.664 MemoryStore: INFO: Block broadcast_131_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:39.664 BlockManagerInfo: INFO: Added broadcast_131_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:39.666 SparkContext: INFO: Created broadcast 131 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:39.666 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 73 (MapPartitionsRDD[158] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:39.666 TaskSchedulerImpl: INFO: Adding task set 73.0 with 2 tasks resource profile 0 2023-04-22 21:12:39.667 TaskSetManager: INFO: Starting task 0.0 in stage 73.0 (TID 374) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.667 Executor: INFO: Running task 0.0 in stage 73.0 (TID 374) 2023-04-22 21:12:39.692 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:39.693 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:39.697 Executor: INFO: Finished task 0.0 in stage 73.0 (TID 374). 
34646 bytes result sent to driver 2023-04-22 21:12:39.698 TaskSetManager: INFO: Starting task 1.0 in stage 73.0 (TID 375) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.699 TaskSetManager: INFO: Finished task 0.0 in stage 73.0 (TID 374) in 33 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:39.699 Executor: INFO: Running task 1.0 in stage 73.0 (TID 375) 2023-04-22 21:12:39.724 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:39.724 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:39.728 Executor: INFO: Finished task 1.0 in stage 73.0 (TID 375). 34646 bytes result sent to driver 2023-04-22 21:12:39.728 TaskSetManager: INFO: Finished task 1.0 in stage 73.0 (TID 375) in 30 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:39.728 TaskSchedulerImpl: INFO: Removed TaskSet 73.0, whose tasks have all completed, from pool 2023-04-22 21:12:39.729 DAGScheduler: INFO: ResultStage 73 (treeAggregate at RowMatrix.scala:94) finished in 0.111 s 2023-04-22 21:12:39.729 DAGScheduler: INFO: Job 37 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:39.729 TaskSchedulerImpl: INFO: Killing all running tasks in stage 73: Stage finished 2023-04-22 21:12:39.729 DAGScheduler: INFO: Job 37 finished: treeAggregate at RowMatrix.scala:94, took 1.432938 s 2023-04-22 21:12:39.731 MemoryStore: INFO: Block broadcast_132 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:39.732 MemoryStore: INFO: Block broadcast_132_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:39.733 BlockManagerInfo: INFO: Added broadcast_132_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:39.734 SparkContext: INFO: Created broadcast 132 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:39.794 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:39.795 DAGScheduler: INFO: Registering RDD 160 (treeAggregate at RowMatrix.scala:94) as input to shuffle 36 2023-04-22 21:12:39.795 DAGScheduler: INFO: Got job 38 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:39.795 DAGScheduler: INFO: Final stage: ResultStage 75 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:39.795 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 74) 2023-04-22 21:12:39.795 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 74) 2023-04-22 21:12:39.797 DAGScheduler: INFO: Submitting ShuffleMapStage 74 (MapPartitionsRDD[160] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:39.832 MemoryStore: INFO: Block broadcast_133 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:39.838 MemoryStore: INFO: Block broadcast_133_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:39.840 BlockManagerInfo: INFO: Added broadcast_133_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:39.843 SparkContext: INFO: Created broadcast 133 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:39.843 DAGScheduler: INFO: Submitting 8 missing tasks from 
ShuffleMapStage 74 (MapPartitionsRDD[160] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:39.843 TaskSchedulerImpl: INFO: Adding task set 74.0 with 8 tasks resource profile 0 2023-04-22 21:12:39.844 TaskSetManager: INFO: Starting task 0.0 in stage 74.0 (TID 376) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.844 Executor: INFO: Running task 0.0 in stage 74.0 (TID 376) 2023-04-22 21:12:39.870 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:39.994 Executor: INFO: Finished task 0.0 in stage 74.0 (TID 376). 1197 bytes result sent to driver 2023-04-22 21:12:39.994 TaskSetManager: INFO: Starting task 1.0 in stage 74.0 (TID 377) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:39.994 TaskSetManager: INFO: Finished task 0.0 in stage 74.0 (TID 376) in 150 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:39.995 Executor: INFO: Running task 1.0 in stage 74.0 (TID 377) 2023-04-22 21:12:40.020 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:40.144 Executor: INFO: Finished task 1.0 in stage 74.0 (TID 377). 1197 bytes result sent to driver 2023-04-22 21:12:40.145 TaskSetManager: INFO: Starting task 2.0 in stage 74.0 (TID 378) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.145 TaskSetManager: INFO: Finished task 1.0 in stage 74.0 (TID 377) in 151 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:40.147 Executor: INFO: Running task 2.0 in stage 74.0 (TID 378) 2023-04-22 21:12:40.176 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:40.304 Executor: INFO: Finished task 2.0 in stage 74.0 (TID 378). 1197 bytes result sent to driver 2023-04-22 21:12:40.304 TaskSetManager: INFO: Starting task 3.0 in stage 74.0 (TID 379) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.304 TaskSetManager: INFO: Finished task 2.0 in stage 74.0 (TID 378) in 159 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:40.305 Executor: INFO: Running task 3.0 in stage 74.0 (TID 379) 2023-04-22 21:12:40.345 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:40.467 Executor: INFO: Finished task 3.0 in stage 74.0 (TID 379). 1197 bytes result sent to driver 2023-04-22 21:12:40.467 TaskSetManager: INFO: Starting task 4.0 in stage 74.0 (TID 380) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.467 TaskSetManager: INFO: Finished task 3.0 in stage 74.0 (TID 379) in 163 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:40.468 Executor: INFO: Running task 4.0 in stage 74.0 (TID 380) 2023-04-22 21:12:40.493 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:40.621 Executor: INFO: Finished task 4.0 in stage 74.0 (TID 380). 
1197 bytes result sent to driver 2023-04-22 21:12:40.622 TaskSetManager: INFO: Starting task 5.0 in stage 74.0 (TID 381) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.622 TaskSetManager: INFO: Finished task 4.0 in stage 74.0 (TID 380) in 155 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:40.624 Executor: INFO: Running task 5.0 in stage 74.0 (TID 381) 2023-04-22 21:12:40.648 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:40.772 Executor: INFO: Finished task 5.0 in stage 74.0 (TID 381). 1197 bytes result sent to driver 2023-04-22 21:12:40.773 TaskSetManager: INFO: Starting task 6.0 in stage 74.0 (TID 382) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.776 TaskSetManager: INFO: Finished task 5.0 in stage 74.0 (TID 381) in 154 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:40.776 Executor: INFO: Running task 6.0 in stage 74.0 (TID 382) 2023-04-22 21:12:40.805 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:40.927 Executor: INFO: Finished task 6.0 in stage 74.0 (TID 382). 1197 bytes result sent to driver 2023-04-22 21:12:40.927 TaskSetManager: INFO: Starting task 7.0 in stage 74.0 (TID 383) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:40.935 TaskSetManager: INFO: Finished task 6.0 in stage 74.0 (TID 382) in 163 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:40.939 Executor: INFO: Running task 7.0 in stage 74.0 (TID 383) 2023-04-22 21:12:40.967 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:41.088 Executor: INFO: Finished task 7.0 in stage 74.0 (TID 383). 
1197 bytes result sent to driver 2023-04-22 21:12:41.089 TaskSetManager: INFO: Finished task 7.0 in stage 74.0 (TID 383) in 162 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:41.090 TaskSchedulerImpl: INFO: Removed TaskSet 74.0, whose tasks have all completed, from pool 2023-04-22 21:12:41.090 DAGScheduler: INFO: ShuffleMapStage 74 (treeAggregate at RowMatrix.scala:94) finished in 1.293 s 2023-04-22 21:12:41.090 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:41.090 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:41.090 DAGScheduler: INFO: waiting: Set(ResultStage 75) 2023-04-22 21:12:41.090 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:41.090 DAGScheduler: INFO: Submitting ResultStage 75 (MapPartitionsRDD[162] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:41.137 MemoryStore: INFO: Block broadcast_134 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:41.143 MemoryStore: INFO: Block broadcast_134_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:41.144 BlockManagerInfo: INFO: Added broadcast_134_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:41.144 SparkContext: INFO: Created broadcast 134 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:41.144 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 75 (MapPartitionsRDD[162] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:41.144 TaskSchedulerImpl: INFO: Adding task set 75.0 with 2 tasks resource profile 0 2023-04-22 21:12:41.145 TaskSetManager: INFO: Starting task 0.0 in stage 75.0 (TID 384) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.145 Executor: INFO: Running task 0.0 in stage 75.0 (TID 384) 2023-04-22 21:12:41.171 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:41.171 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 1 ms 2023-04-22 21:12:41.176 Executor: INFO: Finished task 0.0 in stage 75.0 (TID 384). 34646 bytes result sent to driver 2023-04-22 21:12:41.176 TaskSetManager: INFO: Starting task 1.0 in stage 75.0 (TID 385) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.179 TaskSetManager: INFO: Finished task 0.0 in stage 75.0 (TID 384) in 34 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:41.180 Executor: INFO: Running task 1.0 in stage 75.0 (TID 385) 2023-04-22 21:12:41.205 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:41.205 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:41.209 Executor: INFO: Finished task 1.0 in stage 75.0 (TID 385). 
34646 bytes result sent to driver 2023-04-22 21:12:41.211 TaskSetManager: INFO: Finished task 1.0 in stage 75.0 (TID 385) in 34 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:41.211 TaskSchedulerImpl: INFO: Removed TaskSet 75.0, whose tasks have all completed, from pool 2023-04-22 21:12:41.211 DAGScheduler: INFO: ResultStage 75 (treeAggregate at RowMatrix.scala:94) finished in 0.120 s 2023-04-22 21:12:41.211 DAGScheduler: INFO: Job 38 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:41.211 TaskSchedulerImpl: INFO: Killing all running tasks in stage 75: Stage finished 2023-04-22 21:12:41.211 DAGScheduler: INFO: Job 38 finished: treeAggregate at RowMatrix.scala:94, took 1.417124 s 2023-04-22 21:12:41.213 MemoryStore: INFO: Block broadcast_135 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:41.214 MemoryStore: INFO: Block broadcast_135_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:41.214 BlockManagerInfo: INFO: Added broadcast_135_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:41.215 SparkContext: INFO: Created broadcast 135 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:41.280 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:41.281 DAGScheduler: INFO: Registering RDD 164 (treeAggregate at RowMatrix.scala:94) as input to shuffle 37 2023-04-22 21:12:41.281 DAGScheduler: INFO: Got job 39 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:41.281 DAGScheduler: INFO: Final stage: ResultStage 77 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:41.281 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 76) 2023-04-22 21:12:41.281 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 76) 2023-04-22 21:12:41.282 DAGScheduler: INFO: Submitting ShuffleMapStage 76 (MapPartitionsRDD[164] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:41.317 MemoryStore: INFO: Block broadcast_136 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:41.337 MemoryStore: INFO: Block broadcast_136_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:41.337 BlockManagerInfo: INFO: Added broadcast_136_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:41.337 SparkContext: INFO: Created broadcast 136 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:41.337 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 76 (MapPartitionsRDD[164] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:41.337 TaskSchedulerImpl: INFO: Adding task set 76.0 with 8 tasks resource profile 0 2023-04-22 21:12:41.338 TaskSetManager: INFO: Starting task 0.0 in stage 76.0 (TID 386) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.338 Executor: INFO: Running task 0.0 in stage 76.0 (TID 386) 2023-04-22 21:12:41.363 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:41.486 Executor: INFO: Finished task 0.0 in stage 76.0 (TID 386). 
1197 bytes result sent to driver 2023-04-22 21:12:41.486 TaskSetManager: INFO: Starting task 1.0 in stage 76.0 (TID 387) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.486 TaskSetManager: INFO: Finished task 0.0 in stage 76.0 (TID 386) in 148 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:41.487 Executor: INFO: Running task 1.0 in stage 76.0 (TID 387) 2023-04-22 21:12:41.526 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:41.650 Executor: INFO: Finished task 1.0 in stage 76.0 (TID 387). 1197 bytes result sent to driver 2023-04-22 21:12:41.650 TaskSetManager: INFO: Starting task 2.0 in stage 76.0 (TID 388) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.661 TaskSetManager: INFO: Finished task 1.0 in stage 76.0 (TID 387) in 175 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:41.662 Executor: INFO: Running task 2.0 in stage 76.0 (TID 388) 2023-04-22 21:12:41.687 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:41.809 Executor: INFO: Finished task 2.0 in stage 76.0 (TID 388). 1197 bytes result sent to driver 2023-04-22 21:12:41.809 TaskSetManager: INFO: Starting task 3.0 in stage 76.0 (TID 389) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.811 TaskSetManager: INFO: Finished task 2.0 in stage 76.0 (TID 388) in 161 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:41.811 Executor: INFO: Running task 3.0 in stage 76.0 (TID 389) 2023-04-22 21:12:41.836 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:41.958 Executor: INFO: Finished task 3.0 in stage 76.0 (TID 389). 1197 bytes result sent to driver 2023-04-22 21:12:41.959 TaskSetManager: INFO: Starting task 4.0 in stage 76.0 (TID 390) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:41.959 TaskSetManager: INFO: Finished task 3.0 in stage 76.0 (TID 389) in 150 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:41.959 Executor: INFO: Running task 4.0 in stage 76.0 (TID 390) 2023-04-22 21:12:41.988 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:42.118 Executor: INFO: Finished task 4.0 in stage 76.0 (TID 390). 1197 bytes result sent to driver 2023-04-22 21:12:42.118 TaskSetManager: INFO: Starting task 5.0 in stage 76.0 (TID 391) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.120 TaskSetManager: INFO: Finished task 4.0 in stage 76.0 (TID 390) in 161 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:42.120 Executor: INFO: Running task 5.0 in stage 76.0 (TID 391) 2023-04-22 21:12:42.152 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:42.276 Executor: INFO: Finished task 5.0 in stage 76.0 (TID 391). 
1197 bytes result sent to driver 2023-04-22 21:12:42.276 TaskSetManager: INFO: Starting task 6.0 in stage 76.0 (TID 392) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.277 TaskSetManager: INFO: Finished task 5.0 in stage 76.0 (TID 391) in 159 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:42.277 Executor: INFO: Running task 6.0 in stage 76.0 (TID 392) 2023-04-22 21:12:42.302 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:42.423 Executor: INFO: Finished task 6.0 in stage 76.0 (TID 392). 1197 bytes result sent to driver 2023-04-22 21:12:42.423 TaskSetManager: INFO: Starting task 7.0 in stage 76.0 (TID 393) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.424 TaskSetManager: INFO: Finished task 6.0 in stage 76.0 (TID 392) in 147 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:42.424 Executor: INFO: Running task 7.0 in stage 76.0 (TID 393) 2023-04-22 21:12:42.449 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:42.584 Executor: INFO: Finished task 7.0 in stage 76.0 (TID 393). 1197 bytes result sent to driver 2023-04-22 21:12:42.584 TaskSetManager: INFO: Finished task 7.0 in stage 76.0 (TID 393) in 161 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:42.584 TaskSchedulerImpl: INFO: Removed TaskSet 76.0, whose tasks have all completed, from pool 2023-04-22 21:12:42.584 DAGScheduler: INFO: ShuffleMapStage 76 (treeAggregate at RowMatrix.scala:94) finished in 1.301 s 2023-04-22 21:12:42.584 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:42.584 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:42.584 DAGScheduler: INFO: waiting: Set(ResultStage 77) 2023-04-22 21:12:42.584 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:42.585 DAGScheduler: INFO: Submitting ResultStage 77 (MapPartitionsRDD[166] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:42.624 MemoryStore: INFO: Block broadcast_137 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:42.630 MemoryStore: INFO: Block broadcast_137_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:42.631 BlockManagerInfo: INFO: Added broadcast_137_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:42.631 SparkContext: INFO: Created broadcast 137 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:42.631 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 77 (MapPartitionsRDD[166] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:42.631 TaskSchedulerImpl: INFO: Adding task set 77.0 with 2 tasks resource profile 0 2023-04-22 21:12:42.632 TaskSetManager: INFO: Starting task 0.0 in stage 77.0 (TID 394) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.632 Executor: INFO: Running task 0.0 in stage 77.0 (TID 394) 2023-04-22 21:12:42.658 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:42.658 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:42.662 Executor: INFO: Finished task 0.0 in stage 77.0 (TID 394). 34646 bytes result sent to driver 2023-04-22 21:12:42.663 TaskSetManager: INFO: Starting task 1.0 in stage 77.0 (TID 395) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.664 TaskSetManager: INFO: Finished task 0.0 in stage 77.0 (TID 394) in 32 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:42.664 Executor: INFO: Running task 1.0 in stage 77.0 (TID 395) 2023-04-22 21:12:42.689 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:42.689 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:42.695 Executor: INFO: Finished task 1.0 in stage 77.0 (TID 395). 34646 bytes result sent to driver 2023-04-22 21:12:42.696 TaskSetManager: INFO: Finished task 1.0 in stage 77.0 (TID 395) in 33 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:42.696 TaskSchedulerImpl: INFO: Removed TaskSet 77.0, whose tasks have all completed, from pool 2023-04-22 21:12:42.696 DAGScheduler: INFO: ResultStage 77 (treeAggregate at RowMatrix.scala:94) finished in 0.111 s 2023-04-22 21:12:42.696 DAGScheduler: INFO: Job 39 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:42.696 TaskSchedulerImpl: INFO: Killing all running tasks in stage 77: Stage finished 2023-04-22 21:12:42.697 DAGScheduler: INFO: Job 39 finished: treeAggregate at RowMatrix.scala:94, took 1.416429 s 2023-04-22 21:12:42.698 MemoryStore: INFO: Block broadcast_138 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:42.699 MemoryStore: INFO: Block broadcast_138_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:42.700 BlockManagerInfo: INFO: Added broadcast_138_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:42.710 SparkContext: INFO: Created broadcast 138 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:42.829 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:42.831 DAGScheduler: INFO: Registering RDD 168 (treeAggregate at RowMatrix.scala:94) as input to shuffle 38 2023-04-22 21:12:42.831 DAGScheduler: INFO: Got job 40 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:42.831 DAGScheduler: INFO: Final stage: ResultStage 79 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:42.831 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 78) 2023-04-22 21:12:42.831 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 78) 2023-04-22 21:12:42.832 DAGScheduler: INFO: Submitting ShuffleMapStage 78 (MapPartitionsRDD[168] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:42.901 MemoryStore: INFO: Block broadcast_139 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:42.907 MemoryStore: INFO: Block broadcast_139_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:42.907 BlockManagerInfo: INFO: Added broadcast_139_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:42.908 SparkContext: INFO: Created broadcast 139 from broadcast at DAGScheduler.scala:1513 
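Jobs 36 through 40 above follow the identical two-stage template, differing only in stage/TID numbers and in the burst of "Removed broadcast_*_piece0" messages before stage 72, which is routine cleanup of broadcasts that are no longer referenced. Most of the information is therefore carried by the per-job summary lines ("Job N finished: treeAggregate at RowMatrix.scala:94, took X s"). The snippet below is an illustrative helper, not part of the original session, for condensing a log like this one into a single summary line; the log filename is a placeholder, and the regular expression simply targets those summary lines.

```python
# Illustrative helper, not part of the original session: condense the repeated
# "Job N finished: treeAggregate at RowMatrix.scala:94, took X s" entries into a
# one-line summary. The filename passed at the bottom is a placeholder.
import re
from statistics import mean

JOB_RE = re.compile(r"Job (\d+) finished: treeAggregate at RowMatrix\.scala:94, took ([\d.]+) s")

def summarize_treeaggregate_jobs(log_path: str) -> None:
    jobs = []  # (job id, duration in seconds)
    with open(log_path) as fh:
        for line in fh:
            # findall copes with several log entries wrapped onto one physical line
            for job_id, secs in JOB_RE.findall(line):
                jobs.append((int(job_id), float(secs)))
    if not jobs:
        print("no matching jobs found")
        return
    durations = [secs for _, secs in jobs]
    print(f"{len(jobs)} treeAggregate jobs (ids {jobs[0][0]}..{jobs[-1][0]}), "
          f"mean {mean(durations):.2f} s, total {sum(durations):.1f} s")

summarize_treeaggregate_jobs("hail-session.log")  # placeholder path
```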
2023-04-22 21:12:42.908 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 78 (MapPartitionsRDD[168] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:42.908 TaskSchedulerImpl: INFO: Adding task set 78.0 with 8 tasks resource profile 0 2023-04-22 21:12:42.909 TaskSetManager: INFO: Starting task 0.0 in stage 78.0 (TID 396) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:42.909 Executor: INFO: Running task 0.0 in stage 78.0 (TID 396) 2023-04-22 21:12:42.943 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:43.101 Executor: INFO: Finished task 0.0 in stage 78.0 (TID 396). 1197 bytes result sent to driver 2023-04-22 21:12:43.101 TaskSetManager: INFO: Starting task 1.0 in stage 78.0 (TID 397) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.101 TaskSetManager: INFO: Finished task 0.0 in stage 78.0 (TID 396) in 192 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:43.102 Executor: INFO: Running task 1.0 in stage 78.0 (TID 397) 2023-04-22 21:12:43.127 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:43.253 Executor: INFO: Finished task 1.0 in stage 78.0 (TID 397). 1197 bytes result sent to driver 2023-04-22 21:12:43.254 TaskSetManager: INFO: Starting task 2.0 in stage 78.0 (TID 398) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.254 TaskSetManager: INFO: Finished task 1.0 in stage 78.0 (TID 397) in 153 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:43.254 Executor: INFO: Running task 2.0 in stage 78.0 (TID 398) 2023-04-22 21:12:43.279 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:43.412 Executor: INFO: Finished task 2.0 in stage 78.0 (TID 398). 1197 bytes result sent to driver 2023-04-22 21:12:43.413 TaskSetManager: INFO: Starting task 3.0 in stage 78.0 (TID 399) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.413 TaskSetManager: INFO: Finished task 2.0 in stage 78.0 (TID 398) in 160 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:43.429 Executor: INFO: Running task 3.0 in stage 78.0 (TID 399) 2023-04-22 21:12:43.476 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:43.631 Executor: INFO: Finished task 3.0 in stage 78.0 (TID 399). 1197 bytes result sent to driver 2023-04-22 21:12:43.631 TaskSetManager: INFO: Starting task 4.0 in stage 78.0 (TID 400) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.631 TaskSetManager: INFO: Finished task 3.0 in stage 78.0 (TID 399) in 218 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:43.634 Executor: INFO: Running task 4.0 in stage 78.0 (TID 400) 2023-04-22 21:12:43.659 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:43.788 Executor: INFO: Finished task 4.0 in stage 78.0 (TID 400). 
1197 bytes result sent to driver 2023-04-22 21:12:43.788 TaskSetManager: INFO: Starting task 5.0 in stage 78.0 (TID 401) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.788 TaskSetManager: INFO: Finished task 4.0 in stage 78.0 (TID 400) in 157 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:43.789 Executor: INFO: Running task 5.0 in stage 78.0 (TID 401) 2023-04-22 21:12:43.816 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:43.940 Executor: INFO: Finished task 5.0 in stage 78.0 (TID 401). 1197 bytes result sent to driver 2023-04-22 21:12:43.940 TaskSetManager: INFO: Starting task 6.0 in stage 78.0 (TID 402) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:43.940 TaskSetManager: INFO: Finished task 5.0 in stage 78.0 (TID 401) in 152 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:43.941 Executor: INFO: Running task 6.0 in stage 78.0 (TID 402) 2023-04-22 21:12:43.975 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:44.102 Executor: INFO: Finished task 6.0 in stage 78.0 (TID 402). 1197 bytes result sent to driver 2023-04-22 21:12:44.102 TaskSetManager: INFO: Starting task 7.0 in stage 78.0 (TID 403) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.103 TaskSetManager: INFO: Finished task 6.0 in stage 78.0 (TID 402) in 163 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:44.104 Executor: INFO: Running task 7.0 in stage 78.0 (TID 403) 2023-04-22 21:12:44.130 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:44.257 Executor: INFO: Finished task 7.0 in stage 78.0 (TID 403). 
1197 bytes result sent to driver 2023-04-22 21:12:44.257 TaskSetManager: INFO: Finished task 7.0 in stage 78.0 (TID 403) in 155 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:44.257 TaskSchedulerImpl: INFO: Removed TaskSet 78.0, whose tasks have all completed, from pool 2023-04-22 21:12:44.257 DAGScheduler: INFO: ShuffleMapStage 78 (treeAggregate at RowMatrix.scala:94) finished in 1.424 s 2023-04-22 21:12:44.257 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:44.258 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:44.258 DAGScheduler: INFO: waiting: Set(ResultStage 79) 2023-04-22 21:12:44.258 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:44.258 DAGScheduler: INFO: Submitting ResultStage 79 (MapPartitionsRDD[170] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:44.298 MemoryStore: INFO: Block broadcast_140 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:44.304 MemoryStore: INFO: Block broadcast_140_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:44.305 BlockManagerInfo: INFO: Added broadcast_140_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:44.305 SparkContext: INFO: Created broadcast 140 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:44.305 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 79 (MapPartitionsRDD[170] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:44.305 TaskSchedulerImpl: INFO: Adding task set 79.0 with 2 tasks resource profile 0 2023-04-22 21:12:44.306 TaskSetManager: INFO: Starting task 0.0 in stage 79.0 (TID 404) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.306 Executor: INFO: Running task 0.0 in stage 79.0 (TID 404) 2023-04-22 21:12:44.331 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:44.331 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:44.336 Executor: INFO: Finished task 0.0 in stage 79.0 (TID 404). 34646 bytes result sent to driver 2023-04-22 21:12:44.336 TaskSetManager: INFO: Starting task 1.0 in stage 79.0 (TID 405) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.337 TaskSetManager: INFO: Finished task 0.0 in stage 79.0 (TID 404) in 31 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:44.338 Executor: INFO: Running task 1.0 in stage 79.0 (TID 405) 2023-04-22 21:12:44.363 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:44.363 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:44.367 Executor: INFO: Finished task 1.0 in stage 79.0 (TID 405). 
34646 bytes result sent to driver 2023-04-22 21:12:44.369 TaskSetManager: INFO: Finished task 1.0 in stage 79.0 (TID 405) in 33 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:44.369 TaskSchedulerImpl: INFO: Removed TaskSet 79.0, whose tasks have all completed, from pool 2023-04-22 21:12:44.370 DAGScheduler: INFO: ResultStage 79 (treeAggregate at RowMatrix.scala:94) finished in 0.111 s 2023-04-22 21:12:44.370 DAGScheduler: INFO: Job 40 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:44.370 TaskSchedulerImpl: INFO: Killing all running tasks in stage 79: Stage finished 2023-04-22 21:12:44.370 DAGScheduler: INFO: Job 40 finished: treeAggregate at RowMatrix.scala:94, took 1.541197 s 2023-04-22 21:12:44.371 MemoryStore: INFO: Block broadcast_141 stored as values in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:44.373 MemoryStore: INFO: Block broadcast_141_piece0 stored as bytes in memory (estimated size 97.4 KiB, free 25.2 GiB) 2023-04-22 21:12:44.375 BlockManagerInfo: INFO: Added broadcast_141_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:44.375 SparkContext: INFO: Created broadcast 141 from broadcast at RowMatrix.scala:93 2023-04-22 21:12:44.435 SparkContext: INFO: Starting job: treeAggregate at RowMatrix.scala:94 2023-04-22 21:12:44.441 DAGScheduler: INFO: Registering RDD 172 (treeAggregate at RowMatrix.scala:94) as input to shuffle 39 2023-04-22 21:12:44.441 DAGScheduler: INFO: Got job 41 (treeAggregate at RowMatrix.scala:94) with 2 output partitions 2023-04-22 21:12:44.441 DAGScheduler: INFO: Final stage: ResultStage 81 (treeAggregate at RowMatrix.scala:94) 2023-04-22 21:12:44.441 DAGScheduler: INFO: Parents of final stage: List(ShuffleMapStage 80) 2023-04-22 21:12:44.441 DAGScheduler: INFO: Missing parents: List(ShuffleMapStage 80) 2023-04-22 21:12:44.442 DAGScheduler: INFO: Submitting ShuffleMapStage 80 (MapPartitionsRDD[172] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:44.479 MemoryStore: INFO: Block broadcast_142 stored as values in memory (estimated size 868.9 KiB, free 25.2 GiB) 2023-04-22 21:12:44.485 MemoryStore: INFO: Block broadcast_142_piece0 stored as bytes in memory (estimated size 421.2 KiB, free 25.2 GiB) 2023-04-22 21:12:44.486 BlockManagerInfo: INFO: Added broadcast_142_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:44.486 SparkContext: INFO: Created broadcast 142 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:44.486 DAGScheduler: INFO: Submitting 8 missing tasks from ShuffleMapStage 80 (MapPartitionsRDD[172] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:44.487 TaskSchedulerImpl: INFO: Adding task set 80.0 with 8 tasks resource profile 0 2023-04-22 21:12:44.487 TaskSetManager: INFO: Starting task 0.0 in stage 80.0 (TID 406) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4494 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.491 Executor: INFO: Running task 0.0 in stage 80.0 (TID 406) 2023-04-22 21:12:44.516 BlockManager: INFO: Found block rdd_12_0 locally 2023-04-22 21:12:44.653 Executor: INFO: Finished task 0.0 in stage 80.0 (TID 406). 
1197 bytes result sent to driver 2023-04-22 21:12:44.653 TaskSetManager: INFO: Starting task 1.0 in stage 80.0 (TID 407) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.657 TaskSetManager: INFO: Finished task 0.0 in stage 80.0 (TID 406) in 170 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:44.658 Executor: INFO: Running task 1.0 in stage 80.0 (TID 407) 2023-04-22 21:12:44.683 BlockManager: INFO: Found block rdd_12_1 locally 2023-04-22 21:12:44.810 Executor: INFO: Finished task 1.0 in stage 80.0 (TID 407). 1197 bytes result sent to driver 2023-04-22 21:12:44.810 TaskSetManager: INFO: Starting task 2.0 in stage 80.0 (TID 408) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.810 TaskSetManager: INFO: Finished task 1.0 in stage 80.0 (TID 407) in 157 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:44.811 Executor: INFO: Running task 2.0 in stage 80.0 (TID 408) 2023-04-22 21:12:44.836 BlockManager: INFO: Found block rdd_12_2 locally 2023-04-22 21:12:44.963 Executor: INFO: Finished task 2.0 in stage 80.0 (TID 408). 1197 bytes result sent to driver 2023-04-22 21:12:44.963 TaskSetManager: INFO: Starting task 3.0 in stage 80.0 (TID 409) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:44.964 TaskSetManager: INFO: Finished task 2.0 in stage 80.0 (TID 408) in 154 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:12:44.964 Executor: INFO: Running task 3.0 in stage 80.0 (TID 409) 2023-04-22 21:12:44.989 BlockManager: INFO: Found block rdd_12_3 locally 2023-04-22 21:12:45.118 Executor: INFO: Finished task 3.0 in stage 80.0 (TID 409). 1197 bytes result sent to driver 2023-04-22 21:12:45.118 TaskSetManager: INFO: Starting task 4.0 in stage 80.0 (TID 410) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.118 TaskSetManager: INFO: Finished task 3.0 in stage 80.0 (TID 409) in 155 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:12:45.119 Executor: INFO: Running task 4.0 in stage 80.0 (TID 410) 2023-04-22 21:12:45.144 BlockManager: INFO: Found block rdd_12_4 locally 2023-04-22 21:12:45.273 Executor: INFO: Finished task 4.0 in stage 80.0 (TID 410). 1197 bytes result sent to driver 2023-04-22 21:12:45.273 TaskSetManager: INFO: Starting task 5.0 in stage 80.0 (TID 411) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.273 TaskSetManager: INFO: Finished task 4.0 in stage 80.0 (TID 410) in 155 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:12:45.274 Executor: INFO: Running task 5.0 in stage 80.0 (TID 411) 2023-04-22 21:12:45.299 BlockManager: INFO: Found block rdd_12_5 locally 2023-04-22 21:12:45.424 Executor: INFO: Finished task 5.0 in stage 80.0 (TID 411). 
1197 bytes result sent to driver 2023-04-22 21:12:45.424 TaskSetManager: INFO: Starting task 6.0 in stage 80.0 (TID 412) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.424 TaskSetManager: INFO: Finished task 5.0 in stage 80.0 (TID 411) in 151 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:12:45.425 Executor: INFO: Running task 6.0 in stage 80.0 (TID 412) 2023-04-22 21:12:45.450 BlockManager: INFO: Found block rdd_12_6 locally 2023-04-22 21:12:45.585 Executor: INFO: Finished task 6.0 in stage 80.0 (TID 412). 1197 bytes result sent to driver 2023-04-22 21:12:45.585 TaskSetManager: INFO: Starting task 7.0 in stage 80.0 (TID 413) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.585 TaskSetManager: INFO: Finished task 6.0 in stage 80.0 (TID 412) in 161 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:12:45.603 Executor: INFO: Running task 7.0 in stage 80.0 (TID 413) 2023-04-22 21:12:45.630 BlockManager: INFO: Found block rdd_12_7 locally 2023-04-22 21:12:45.762 Executor: INFO: Finished task 7.0 in stage 80.0 (TID 413). 1197 bytes result sent to driver 2023-04-22 21:12:45.762 TaskSetManager: INFO: Finished task 7.0 in stage 80.0 (TID 413) in 177 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:12:45.762 TaskSchedulerImpl: INFO: Removed TaskSet 80.0, whose tasks have all completed, from pool 2023-04-22 21:12:45.762 DAGScheduler: INFO: ShuffleMapStage 80 (treeAggregate at RowMatrix.scala:94) finished in 1.319 s 2023-04-22 21:12:45.762 DAGScheduler: INFO: looking for newly runnable stages 2023-04-22 21:12:45.762 DAGScheduler: INFO: running: Set() 2023-04-22 21:12:45.762 DAGScheduler: INFO: waiting: Set(ResultStage 81) 2023-04-22 21:12:45.762 DAGScheduler: INFO: failed: Set() 2023-04-22 21:12:45.763 DAGScheduler: INFO: Submitting ResultStage 81 (MapPartitionsRDD[174] at treeAggregate at RowMatrix.scala:94), which has no missing parents 2023-04-22 21:12:45.804 MemoryStore: INFO: Block broadcast_143 stored as values in memory (estimated size 869.9 KiB, free 25.2 GiB) 2023-04-22 21:12:45.810 MemoryStore: INFO: Block broadcast_143_piece0 stored as bytes in memory (estimated size 422.6 KiB, free 25.2 GiB) 2023-04-22 21:12:45.811 BlockManagerInfo: INFO: Added broadcast_143_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:12:45.811 SparkContext: INFO: Created broadcast 143 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:45.811 DAGScheduler: INFO: Submitting 2 missing tasks from ResultStage 81 (MapPartitionsRDD[174] at treeAggregate at RowMatrix.scala:94) (first 15 tasks are for partitions Vector(0, 1)) 2023-04-22 21:12:45.811 TaskSchedulerImpl: INFO: Adding task set 81.0 with 2 tasks resource profile 0 2023-04-22 21:12:45.812 TaskSetManager: INFO: Starting task 0.0 in stage 81.0 (TID 414) (uger-c010.broadinstitute.org, executor driver, partition 0, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.812 Executor: INFO: Running task 0.0 in stage 81.0 (TID 414) 2023-04-22 21:12:45.838 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:45.838 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 
21:12:45.846 Executor: INFO: Finished task 0.0 in stage 81.0 (TID 414). 34646 bytes result sent to driver 2023-04-22 21:12:45.846 TaskSetManager: INFO: Starting task 1.0 in stage 81.0 (TID 415) (uger-c010.broadinstitute.org, executor driver, partition 1, NODE_LOCAL, 4271 bytes) taskResourceAssignments Map() 2023-04-22 21:12:45.847 TaskSetManager: INFO: Finished task 0.0 in stage 81.0 (TID 414) in 35 ms on uger-c010.broadinstitute.org (executor driver) (1/2) 2023-04-22 21:12:45.847 Executor: INFO: Running task 1.0 in stage 81.0 (TID 415) 2023-04-22 21:12:45.874 ShuffleBlockFetcherIterator: INFO: Getting 4 (139.6 KiB) non-empty blocks including 4 (139.6 KiB) local and 0 (0.0 B) host-local and 0 (0.0 B) push-merged-local and 0 (0.0 B) remote blocks 2023-04-22 21:12:45.874 ShuffleBlockFetcherIterator: INFO: Started 0 remote fetches in 0 ms 2023-04-22 21:12:45.878 Executor: INFO: Finished task 1.0 in stage 81.0 (TID 415). 34646 bytes result sent to driver 2023-04-22 21:12:45.883 TaskSetManager: INFO: Finished task 1.0 in stage 81.0 (TID 415) in 37 ms on uger-c010.broadinstitute.org (executor driver) (2/2) 2023-04-22 21:12:45.884 TaskSchedulerImpl: INFO: Removed TaskSet 81.0, whose tasks have all completed, from pool 2023-04-22 21:12:45.884 DAGScheduler: INFO: ResultStage 81 (treeAggregate at RowMatrix.scala:94) finished in 0.121 s 2023-04-22 21:12:45.884 DAGScheduler: INFO: Job 41 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:12:45.884 TaskSchedulerImpl: INFO: Killing all running tasks in stage 81: Stage finished 2023-04-22 21:12:45.884 DAGScheduler: INFO: Job 41 finished: treeAggregate at RowMatrix.scala:94, took 1.449180 s 2023-04-22 21:12:46.215 RowMatrix: WARN: The input data was not directly cached, which may hurt performance if its parent RDDs are also uncached. 
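The `RowMatrix: WARN: The input data was not directly cached` message just above comes from Spark MLlib: the RDD feeding the repeated `treeAggregate at RowMatrix.scala:94` passes is being recomputed rather than served from cache. In a Hail pipeline the usual mitigation is to persist or checkpoint the MatrixTable before the PCA step. The driver script is not part of this log, so the following is only a minimal sketch; the input path, the `k` value, and the choice of `persist()` versus `checkpoint()` are illustrative assumptions, not taken from the log:

```python
import hail as hl

hl.init()

# Hypothetical input; the actual dataset path is not recorded in this log.
mt = hl.read_matrix_table("/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/example_input.mt")

# Persisting (or checkpointing to disk) before PCA keeps the genotype matrix
# from being recomputed on every treeAggregate pass, which is what the
# RowMatrix caching warning is complaining about.
mt = mt.persist()  # or: mt = mt.checkpoint("/path/to/tmp/pca_input.mt")

eigenvalues, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)
```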
2023-04-22 21:12:46.498 SparkContext: INFO: Starting job: collect at PCA.scala:51 2023-04-22 21:12:46.498 DAGScheduler: INFO: Got job 42 (collect at PCA.scala:51) with 8 output partitions 2023-04-22 21:12:46.498 DAGScheduler: INFO: Final stage: ResultStage 82 (collect at PCA.scala:51) 2023-04-22 21:12:46.498 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:12:46.498 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:12:46.498 DAGScheduler: INFO: Submitting ResultStage 82 (MapPartitionsRDD[178] at map at PCA.scala:51), which has no missing parents 2023-04-22 21:12:46.533 MemoryStore: INFO: Block broadcast_144 stored as values in memory (estimated size 865.1 KiB, free 25.2 GiB) 2023-04-22 21:12:46.539 MemoryStore: INFO: Block broadcast_144_piece0 stored as bytes in memory (estimated size 419.4 KiB, free 25.2 GiB) 2023-04-22 21:12:46.539 BlockManagerInfo: INFO: Added broadcast_144_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 419.4 KiB, free: 25.3 GiB) 2023-04-22 21:12:46.540 SparkContext: INFO: Created broadcast 144 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:12:46.540 DAGScheduler: INFO: Submitting 8 missing tasks from ResultStage 82 (MapPartitionsRDD[178] at map at PCA.scala:51) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:12:46.540 TaskSchedulerImpl: INFO: Adding task set 82.0 with 8 tasks resource profile 0 2023-04-22 21:12:46.541 TaskSetManager: INFO: Starting task 0.0 in stage 82.0 (TID 416) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4505 bytes) taskResourceAssignments Map() 2023-04-22 21:12:46.541 Executor: INFO: Running task 0.0 in stage 82.0 (TID 416) 2023-04-22 21:12:46.567 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 0.0 in stage 82.0 (TID 416) 2023-04-22 21:12:46.569 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 82.0 (TID 416) 2023-04-22 21:12:53.080 : INFO: TaskReport: stage=82, partition=0, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:12:53.080 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 0.0 in stage 82.0 (TID 416) 2023-04-22 21:12:53.277 Executor: INFO: Finished task 0.0 in stage 82.0 (TID 416). 
480472 bytes result sent to driver 2023-04-22 21:12:53.277 TaskSetManager: INFO: Starting task 1.0 in stage 82.0 (TID 417) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:12:53.311 Executor: INFO: Running task 1.0 in stage 82.0 (TID 417) 2023-04-22 21:12:53.432 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 1.0 in stage 82.0 (TID 417) 2023-04-22 21:12:53.444 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 82.0 (TID 417) 2023-04-22 21:12:53.496 TaskSetManager: INFO: Finished task 0.0 in stage 82.0 (TID 416) in 6956 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:12:59.537 : INFO: TaskReport: stage=82, partition=1, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:12:59.537 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 1.0 in stage 82.0 (TID 417) 2023-04-22 21:12:59.582 Executor: INFO: Finished task 1.0 in stage 82.0 (TID 417). 481263 bytes result sent to driver 2023-04-22 21:12:59.582 TaskSetManager: INFO: Starting task 2.0 in stage 82.0 (TID 418) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:12:59.627 Executor: INFO: Running task 2.0 in stage 82.0 (TID 418) 2023-04-22 21:12:59.687 TaskSetManager: INFO: Finished task 1.0 in stage 82.0 (TID 417) in 6410 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:12:59.833 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 2.0 in stage 82.0 (TID 418) 2023-04-22 21:12:59.834 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 82.0 (TID 418) 2023-04-22 21:12:59.880 BlockManagerInfo: INFO: Removed broadcast_136_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:12:59.984 BlockManagerInfo: INFO: Removed broadcast_133_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.104 BlockManagerInfo: INFO: Removed broadcast_141_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.291 BlockManagerInfo: INFO: Removed broadcast_131_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.397 BlockManagerInfo: INFO: Removed broadcast_140_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.469 BlockManagerInfo: INFO: Removed broadcast_139_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.543 BlockManagerInfo: INFO: Removed broadcast_138_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.610 BlockManagerInfo: INFO: Removed broadcast_135_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.611 BlockManagerInfo: INFO: Removed broadcast_129_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 
97.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.612 BlockManagerInfo: INFO: Removed broadcast_142_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.614 BlockManagerInfo: INFO: Removed broadcast_130_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.615 BlockManagerInfo: INFO: Removed broadcast_132_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.617 BlockManagerInfo: INFO: Removed broadcast_137_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.617 BlockManagerInfo: INFO: Removed broadcast_134_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:00.618 BlockManagerInfo: INFO: Removed broadcast_143_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 422.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:05.968 : INFO: TaskReport: stage=82, partition=2, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:13:05.968 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 2.0 in stage 82.0 (TID 418) 2023-04-22 21:13:06.052 Executor: INFO: Finished task 2.0 in stage 82.0 (TID 418). 481149 bytes result sent to driver 2023-04-22 21:13:06.052 TaskSetManager: INFO: Starting task 3.0 in stage 82.0 (TID 419) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:13:06.070 TaskSetManager: INFO: Finished task 2.0 in stage 82.0 (TID 418) in 6488 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:13:06.070 Executor: INFO: Running task 3.0 in stage 82.0 (TID 419) 2023-04-22 21:13:06.109 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 3.0 in stage 82.0 (TID 419) 2023-04-22 21:13:06.111 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 82.0 (TID 419) 2023-04-22 21:13:11.890 : INFO: TaskReport: stage=82, partition=3, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:13:11.890 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 3.0 in stage 82.0 (TID 419) 2023-04-22 21:13:11.945 Executor: INFO: Finished task 3.0 in stage 82.0 (TID 419). 
477655 bytes result sent to driver 2023-04-22 21:13:11.945 TaskSetManager: INFO: Starting task 4.0 in stage 82.0 (TID 420) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:13:11.973 TaskSetManager: INFO: Finished task 3.0 in stage 82.0 (TID 419) in 5921 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:13:11.982 Executor: INFO: Running task 4.0 in stage 82.0 (TID 420) 2023-04-22 21:13:12.021 : INFO: RegionPool: initialized for thread 61: Executor task launch worker for task 4.0 in stage 82.0 (TID 420) 2023-04-22 21:13:12.022 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 82.0 (TID 420) 2023-04-22 21:13:17.845 : INFO: TaskReport: stage=82, partition=4, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:13:17.845 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 61: Executor task launch worker for task 4.0 in stage 82.0 (TID 420) 2023-04-22 21:13:17.934 Executor: INFO: Finished task 4.0 in stage 82.0 (TID 420). 488031 bytes result sent to driver 2023-04-22 21:13:17.934 TaskSetManager: INFO: Starting task 5.0 in stage 82.0 (TID 421) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:13:17.949 TaskSetManager: INFO: Finished task 4.0 in stage 82.0 (TID 420) in 6004 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:13:17.963 Executor: INFO: Running task 5.0 in stage 82.0 (TID 421) 2023-04-22 21:13:18.003 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 5.0 in stage 82.0 (TID 421) 2023-04-22 21:13:18.003 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 82.0 (TID 421) 2023-04-22 21:13:23.707 : INFO: TaskReport: stage=82, partition=5, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:13:23.708 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 82.0 (TID 421) 2023-04-22 21:13:23.752 Executor: INFO: Finished task 5.0 in stage 82.0 (TID 421). 
490246 bytes result sent to driver 2023-04-22 21:13:23.753 TaskSetManager: INFO: Starting task 6.0 in stage 82.0 (TID 422) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:13:23.767 TaskSetManager: INFO: Finished task 5.0 in stage 82.0 (TID 421) in 5833 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:13:23.767 Executor: INFO: Running task 6.0 in stage 82.0 (TID 422) 2023-04-22 21:13:23.793 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 6.0 in stage 82.0 (TID 422) 2023-04-22 21:13:23.794 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 82.0 (TID 422) 2023-04-22 21:13:29.501 : INFO: TaskReport: stage=82, partition=6, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14324, cache hits=14323 2023-04-22 21:13:29.501 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 82.0 (TID 422) 2023-04-22 21:13:29.544 Executor: INFO: Finished task 6.0 in stage 82.0 (TID 422). 489844 bytes result sent to driver 2023-04-22 21:13:29.544 TaskSetManager: INFO: Starting task 7.0 in stage 82.0 (TID 423) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4510 bytes) taskResourceAssignments Map() 2023-04-22 21:13:29.558 TaskSetManager: INFO: Finished task 6.0 in stage 82.0 (TID 422) in 5806 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:13:29.559 Executor: INFO: Running task 7.0 in stage 82.0 (TID 423) 2023-04-22 21:13:29.585 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 7.0 in stage 82.0 (TID 423) 2023-04-22 21:13:29.586 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 82.0 (TID 423) 2023-04-22 21:13:35.295 : INFO: TaskReport: stage=82, partition=7, attempt=0, peakBytes=393216, peakBytesReadable=384.00 KiB, chunks requested=14323, cache hits=14322 2023-04-22 21:13:35.295 : INFO: RegionPool: FREE: 384.0K allocated (320.0K blocks / 64.0K chunks), regions.size = 4, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 82.0 (TID 423) 2023-04-22 21:13:35.347 Executor: INFO: Finished task 7.0 in stage 82.0 (TID 423). 489904 bytes result sent to driver 2023-04-22 21:13:35.361 TaskSetManager: INFO: Finished task 7.0 in stage 82.0 (TID 423) in 5817 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:13:35.361 TaskSchedulerImpl: INFO: Removed TaskSet 82.0, whose tasks have all completed, from pool 2023-04-22 21:13:35.361 DAGScheduler: INFO: ResultStage 82 (collect at PCA.scala:51) finished in 48.862 s 2023-04-22 21:13:35.361 DAGScheduler: INFO: Job 42 is finished. 
Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:13:35.361 TaskSchedulerImpl: INFO: Killing all running tasks in stage 82: Stage finished 2023-04-22 21:13:35.362 DAGScheduler: INFO: Job 42 finished: collect at PCA.scala:51, took 48.864527 s 2023-04-22 21:13:35.371 MemoryStore: INFO: Block broadcast_145 stored as values in memory (estimated size 39.7 MiB, free 25.1 GiB) 2023-04-22 21:13:35.768 MemoryStore: INFO: Block broadcast_145_piece0 stored as bytes in memory (estimated size 1087.8 KiB, free 25.1 GiB) 2023-04-22 21:13:35.768 BlockManagerInfo: INFO: Added broadcast_145_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 1087.8 KiB, free: 25.3 GiB) 2023-04-22 21:13:35.768 SparkContext: INFO: Created broadcast 145 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:36.442 : INFO: RegionPool: REPORT_THRESHOLD: 320.0K allocated (192.0K blocks / 128.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:36.453 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (320.0K blocks / 192.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:36.558 : INFO: encoder cache miss (10 hits, 9 misses, 0.526) 2023-04-22 21:13:36.561 : INFO: instruction count: 3: __C888HailClassLoaderContainer. 2023-04-22 21:13:36.561 : INFO: instruction count: 3: __C888HailClassLoaderContainer. 2023-04-22 21:13:36.562 : INFO: instruction count: 3: __C890FSContainer. 2023-04-22 21:13:36.562 : INFO: instruction count: 3: __C890FSContainer. 2023-04-22 21:13:36.577 : INFO: instruction count: 3: __C892etypeEncode. 2023-04-22 21:13:36.577 : INFO: instruction count: 7: __C892etypeEncode.apply 2023-04-22 21:13:36.577 : INFO: instruction count: 49: __C892etypeEncode.__m894ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND 2023-04-22 21:13:36.578 : INFO: instruction count: 39: __C892etypeEncode.__m895ENCODE_SIndexablePointer_TO_r_array_of_r_float64 2023-04-22 21:13:36.578 : INFO: instruction count: 4: __C892etypeEncode.__m896ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:36.578 : INFO: instruction count: 35: __C892etypeEncode.__m897ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:36.578 : INFO: instruction count: 37: __C892etypeEncode.__m898ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:36.578 : INFO: instruction count: 16: __C892etypeEncode.__m899ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:36.623 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (704.0K blocks / 320.0K chunks), regions.size = 3, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:36.937 : INFO: instruction count: 3: __C900HailClassLoaderContainer. 2023-04-22 21:13:36.937 : INFO: instruction count: 3: __C900HailClassLoaderContainer. 2023-04-22 21:13:36.937 : INFO: instruction count: 3: __C902FSContainer. 2023-04-22 21:13:36.937 : INFO: instruction count: 3: __C902FSContainer. 2023-04-22 21:13:36.996 : INFO: instruction count: 3: __C904indexwriter. 
2023-04-22 21:13:36.997 : INFO: instruction count: 367: __C904indexwriter.apply 2023-04-22 21:13:37.002 : INFO: instruction count: 653: __C904indexwriter.__m914writeInternalNode 2023-04-22 21:13:37.002 : INFO: instruction count: 25: __C904indexwriter.__m915ENCODE_SBaseStructPointer_TO_o_struct_of_r_array_of_r_struct_of_r_int64ANDr_int64ANDr_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDENDEND 2023-04-22 21:13:37.002 : INFO: instruction count: 35: __C904indexwriter.__m916ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_int64ANDr_int64ANDr_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDEND 2023-04-22 21:13:37.002 : INFO: instruction count: 53: __C904indexwriter.__m917ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64ANDr_int64ANDr_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDEND 2023-04-22 21:13:37.002 : INFO: instruction count: 4: __C904indexwriter.__m918ENCODE_SInt64$_TO_r_int64 2023-04-22 21:13:37.003 : INFO: instruction count: 96: __C904indexwriter.__m919ENCODE_SBaseStructPointer_TO_r_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryEND 2023-04-22 21:13:37.003 : INFO: instruction count: 25: __C904indexwriter.__m920ENCODE_SCanonicalLocusPointer_TO_o_struct_of_r_binaryANDr_int32END 2023-04-22 21:13:37.003 : INFO: instruction count: 16: __C904indexwriter.__m921ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:37.003 : INFO: instruction count: 4: __C904indexwriter.__m922ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:37.003 : INFO: instruction count: 115: __C904indexwriter.__m923ENCODE_SIndexablePointer_TO_o_array_of_o_binary 2023-04-22 21:13:37.003 : INFO: instruction count: 16: __C904indexwriter.__m924ENCODE_SStringPointer_TO_o_binary 2023-04-22 21:13:37.003 : INFO: instruction count: 1: __C904indexwriter.__m925ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:13:37.004 : INFO: instruction count: 552: __C904indexwriter.__m927writeLeafNode 2023-04-22 21:13:37.004 : INFO: instruction count: 37: __C904indexwriter.__m928ENCODE_SBaseStructPointer_TO_o_struct_of_r_int64ANDr_array_of_r_struct_of_r_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDENDEND 2023-04-22 21:13:37.004 : INFO: instruction count: 35: __C904indexwriter.__m929ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDEND 2023-04-22 21:13:37.004 : INFO: instruction count: 29: __C904indexwriter.__m930ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryENDANDr_int64ANDr_struct_of_ENDEND 2023-04-22 21:13:37.004 : INFO: instruction count: 52: __C904indexwriter.__m931flush 2023-04-22 21:13:37.017 : INFO: instruction count: 192: __C904indexwriter.init 2023-04-22 21:13:37.017 : INFO: instruction count: 28: __C904indexwriter.close 2023-04-22 21:13:37.017 : INFO: instruction count: 7: __C904indexwriter.trackedOS 2023-04-22 21:13:37.017 : INFO: instruction count: 9: __C904indexwriter.setPartitionIndex 2023-04-22 21:13:37.017 : INFO: instruction count: 4: __C904indexwriter.addPartitionRegion 2023-04-22 21:13:37.017 : INFO: instruction count: 4: __C904indexwriter.setPool 2023-04-22 21:13:37.018 : INFO: instruction count: 3: __C904indexwriter.addHailClassLoader 2023-04-22 21:13:37.018 : INFO: instruction count: 3: __C904indexwriter.addFS 2023-04-22 21:13:37.018 : INFO: instruction 
count: 4: __C904indexwriter.addTaskContext 2023-04-22 21:13:37.019 : INFO: instruction count: 3: __C904indexwriter.setObjects 2023-04-22 21:13:37.019 : INFO: instruction count: 3: __C937__m914writeInternalNodeSpills. 2023-04-22 21:13:37.019 : INFO: instruction count: 3: __C940__m927writeLeafNodeSpills. 2023-04-22 21:13:37.019 : INFO: instruction count: 3: __C932staticWrapperClass_1. 2023-04-22 21:13:37.021 : INFO: encoder cache miss (10 hits, 10 misses, 0.500) 2023-04-22 21:13:37.027 : INFO: instruction count: 3: __C941HailClassLoaderContainer. 2023-04-22 21:13:37.027 : INFO: instruction count: 3: __C941HailClassLoaderContainer. 2023-04-22 21:13:37.028 : INFO: instruction count: 3: __C943FSContainer. 2023-04-22 21:13:37.028 : INFO: instruction count: 3: __C943FSContainer. 2023-04-22 21:13:37.032 : INFO: instruction count: 3: __C945etypeEncode. 2023-04-22 21:13:37.033 : INFO: instruction count: 7: __C945etypeEncode.apply 2023-04-22 21:13:37.033 : INFO: instruction count: 142: __C945etypeEncode.__m947ENCODE_SBaseStructPointer_TO_r_struct_of_o_struct_of_r_binaryANDr_int32ENDANDo_array_of_o_binaryANDo_array_of_o_float64END 2023-04-22 21:13:37.033 : INFO: instruction count: 25: __C945etypeEncode.__m948ENCODE_SCanonicalLocusPointer_TO_o_struct_of_r_binaryANDr_int32END 2023-04-22 21:13:37.033 : INFO: instruction count: 16: __C945etypeEncode.__m949ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:37.033 : INFO: instruction count: 4: __C945etypeEncode.__m950ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:37.033 : INFO: instruction count: 115: __C945etypeEncode.__m951ENCODE_SIndexablePointer_TO_o_array_of_o_binary 2023-04-22 21:13:37.033 : INFO: instruction count: 16: __C945etypeEncode.__m952ENCODE_SStringPointer_TO_o_binary 2023-04-22 21:13:37.033 : INFO: instruction count: 115: __C945etypeEncode.__m953ENCODE_SIndexablePointer_TO_o_array_of_o_float64 2023-04-22 21:13:37.033 : INFO: instruction count: 4: __C945etypeEncode.__m954ENCODE_SFloat64$_TO_o_float64 2023-04-22 21:13:37.380 SparkContext: INFO: Starting job: collect at ContextRDD.scala:176 2023-04-22 21:13:37.380 DAGScheduler: INFO: Job 43 finished: collect at ContextRDD.scala:176, took 0.000206 s 2023-04-22 21:13:37.581 Hail: INFO: wrote table with 0 rows in 0 partitions to /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 Total size: 347.34 KiB * Rows: 0.00 B * Globals: 347.34 KiB * Smallest partition: N/A * Largest partition: N/A 2023-04-22 21:13:37.581 : INFO: took 3m50.5s 2023-04-22 21:13:37.581 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Begin) 2023-04-22 21:13:37.582 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Begin) 2023-04-22 21:13:37.583 : INFO: initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.583 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.583 : INFO: after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.583 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.584 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.584 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.586 : INFO: instruction count: 3: __C955HailClassLoaderContainer. 2023-04-22 21:13:37.586 : INFO: instruction count: 3: __C955HailClassLoaderContainer. 2023-04-22 21:13:37.587 : INFO: instruction count: 3: __C957FSContainer. 2023-04-22 21:13:37.587 : INFO: instruction count: 3: __C957FSContainer. 
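The job sequence recorded above, repeated `treeAggregate at RowMatrix.scala:94` stages, a final `collect at PCA.scala:51`, and a persisted table written under `.../PC_Relate/HG38/persist_table...`, is consistent with a PCA-plus-PC-Relate relatedness workflow on the GnomAD HGDP+1KG data. The driver code itself does not appear in the log, so the sketch below is only a hedged reconstruction of what such a pipeline typically looks like in Hail; the input path, `k`, MAF cutoff, and output name are assumptions:

```python
import hail as hl

# Hypothetical input path; only the PC_Relate output directory appears in the log.
mt = hl.read_matrix_table("/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/example_input.mt")

# HWE-normalized PCA: this is the step that issues the repeated
# `treeAggregate at RowMatrix.scala:94` jobs and the `collect at PCA.scala:51` job.
_, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)

# PC-Relate kinship estimation, reusing the precomputed PC scores.
rel = hl.pc_relate(
    mt.GT,
    min_individual_maf=0.01,
    scores_expr=scores[mt.col_key].scores,
    statistics="kin",
)

# Persisting an intermediate table is the kind of operation that produces a
# "wrote table ... to .../persist_tableXXXX" message like the one above.
rel = rel.persist()
rel.write("/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/pc_relate_kin.ht", overwrite=True)
```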
2023-04-22 21:13:37.587 : INFO: instruction count: 3: __C959Compiled. 2023-04-22 21:13:37.587 : INFO: instruction count: 1: __C959Compiled.apply 2023-04-22 21:13:37.587 : INFO: instruction count: 9: __C959Compiled.setPartitionIndex 2023-04-22 21:13:37.587 : INFO: instruction count: 4: __C959Compiled.addPartitionRegion 2023-04-22 21:13:37.587 : INFO: instruction count: 4: __C959Compiled.setPool 2023-04-22 21:13:37.587 : INFO: instruction count: 3: __C959Compiled.addHailClassLoader 2023-04-22 21:13:37.587 : INFO: instruction count: 3: __C959Compiled.addFS 2023-04-22 21:13:37.587 : INFO: instruction count: 4: __C959Compiled.addTaskContext 2023-04-22 21:13:37.587 : INFO: initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.588 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.588 : INFO: after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.588 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.588 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.588 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.589 : INFO: instruction count: 3: __C965HailClassLoaderContainer. 2023-04-22 21:13:37.589 : INFO: instruction count: 3: __C965HailClassLoaderContainer. 2023-04-22 21:13:37.589 : INFO: instruction count: 3: __C967FSContainer. 2023-04-22 21:13:37.590 : INFO: instruction count: 3: __C967FSContainer. 2023-04-22 21:13:37.590 : INFO: instruction count: 3: __C969Compiled. 2023-04-22 21:13:37.590 : INFO: instruction count: 1: __C969Compiled.apply 2023-04-22 21:13:37.590 : INFO: instruction count: 9: __C969Compiled.setPartitionIndex 2023-04-22 21:13:37.590 : INFO: instruction count: 4: __C969Compiled.addPartitionRegion 2023-04-22 21:13:37.590 : INFO: instruction count: 4: __C969Compiled.setPool 2023-04-22 21:13:37.590 : INFO: instruction count: 3: __C969Compiled.addHailClassLoader 2023-04-22 21:13:37.590 : INFO: instruction count: 3: __C969Compiled.addFS 2023-04-22 21:13:37.590 : INFO: instruction count: 4: __C969Compiled.addTaskContext 2023-04-22 21:13:37.590 : INFO: initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.591 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Begin) 2023-04-22 21:13:37.608 : INFO: after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.608 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Begin) 2023-04-22 21:13:37.608 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.609 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Begin) 2023-04-22 21:13:37.609 : INFO: instruction count: 3: __C975HailClassLoaderContainer. 2023-04-22 21:13:37.609 : INFO: instruction count: 3: __C975HailClassLoaderContainer. 2023-04-22 21:13:37.609 : INFO: instruction count: 3: __C977FSContainer. 2023-04-22 21:13:37.609 : INFO: instruction count: 3: __C977FSContainer. 2023-04-22 21:13:37.610 : INFO: instruction count: 3: __C979Compiled. 
2023-04-22 21:13:37.610 : INFO: instruction count: 1: __C979Compiled.apply 2023-04-22 21:13:37.610 : INFO: instruction count: 9: __C979Compiled.setPartitionIndex 2023-04-22 21:13:37.610 : INFO: instruction count: 4: __C979Compiled.addPartitionRegion 2023-04-22 21:13:37.610 : INFO: instruction count: 4: __C979Compiled.setPool 2023-04-22 21:13:37.610 : INFO: instruction count: 3: __C979Compiled.addHailClassLoader 2023-04-22 21:13:37.610 : INFO: instruction count: 3: __C979Compiled.addFS 2023-04-22 21:13:37.610 : INFO: instruction count: 4: __C979Compiled.addTaskContext 2023-04-22 21:13:37.613 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:37.613 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:37.613 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:37.613 : INFO: finished execution of query hail_query_3, result size is 0.00 B 2023-04-22 21:13:37.613 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:37.613 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:37.613 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:37.613 : INFO: RegionPool: FREE: 1.3M allocated (1.0M blocks / 256.0K chunks), regions.size = 3, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode total 3m51.1s self 147.588ms children 3m51.0s %children 99.94% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 142.174ms self 0.008ms children 142.166ms %children 99.99% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 142.004ms self 0.032ms children 141.972ms %children 99.98% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 141.972ms self 0.111ms children 141.860ms %children 99.92% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 1.306ms self 1.306ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.611ms self 1.611ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 10.657ms self 10.657ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 41.088ms self 41.088ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.432ms self 4.432ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: 
timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.392ms self 0.392ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 10.571ms self 10.571ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.613 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.305ms self 0.305ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 16.931ms self 16.931ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.651ms self 1.651ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.923ms self 1.923ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.449ms self 3.449ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.489ms self 0.489ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 32.668ms self 32.668ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.272ms self 0.272ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.757ms self 0.757ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.480ms self 1.480ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.726ms self 1.726ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 5.636ms self 5.636ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.219ms self 0.219ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 4.297ms self 4.297ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.111ms self 0.111ms children 
0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 7.672ms self 0.006ms children 7.666ms %children 99.92% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 7.316ms self 7.316ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.313ms self 0.313ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 186.496ms self 0.007ms children 186.489ms %children 100.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 186.304ms self 0.020ms children 186.283ms %children 99.99% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 186.283ms self 0.125ms children 186.158ms %children 99.93% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.628ms self 0.628ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 18.310ms self 18.310ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 2.396ms self 2.396ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 79.679ms self 79.679ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 3.417ms self 3.417ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.363ms self 0.363ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 21.536ms self 21.536ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.347ms self 0.347ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.620ms self 0.620ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.202ms self 1.202ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.602ms self 1.602ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 27.958ms self 27.958ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.228ms self 0.228ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.831ms self 3.831ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.581ms self 0.581ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.119ms self 1.119ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.201ms self 1.201ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 16.834ms self 16.834ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.225ms self 0.225ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.751ms self 3.751ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.126ms self 0.126ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 11.485ms self 0.008ms children 11.477ms %children 99.93% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 11.364ms self 11.364ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.050ms self 0.050ms children 
0.000ms %children 0.00% 2023-04-22 21:13:37.614 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 0.325ms self 0.005ms children 0.320ms %children 98.53% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.360ms self 0.005ms children 0.355ms %children 98.50% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.103ms self 0.103ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.146ms self 0.146ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.106ms self 0.106ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 83.110ms self 0.006ms children 83.104ms %children 99.99% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 82.937ms self 0.016ms children 82.922ms %children 99.98% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 82.922ms self 0.120ms children 82.801ms %children 99.85% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.198ms self 0.198ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.670ms self 0.670ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.200ms self 1.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.182ms self 1.182ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 2.771ms self 2.771ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.229ms self 0.229ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 11.850ms self 11.850ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.328ms self 0.328ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.110ms self 1.110ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.115ms self 1.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.753ms self 1.753ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 3.111ms self 3.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 8.848ms self 8.848ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 5.226ms self 5.226ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 2.422ms self 2.422ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.564ms self 0.564ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.093ms self 1.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.205ms self 1.205ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 11.173ms self 11.173ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.276ms self 0.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 26.478ms self 26.478ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, 
after LowerAndExecuteShuffles/Verify total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 3m50.5s self 0.013ms children 3m50.5s %children 100.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 3m50.5s self 3m49.8s children 686.673ms %children 0.30% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 58.312ms self 0.008ms children 58.304ms %children 99.99% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 58.204ms self 0.019ms children 58.185ms %children 99.97% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 58.185ms self 0.123ms children 58.062ms %children 99.79% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.562ms self 0.562ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.222ms self 0.222ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 23.089ms self 23.089ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.532ms self 0.532ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.281ms self 1.281ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.972ms self 1.972ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/FoldConstants total 0.201ms self 0.201ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.417ms self 0.417ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.615 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.499ms self 0.499ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.220ms self 1.220ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.511ms self 1.511ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.411ms self 0.411ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.493ms self 0.493ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 23.129ms self 23.129ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.583ms self 1.583ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.171ms self 0.005ms children 0.166ms %children 97.02% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 29.782ms self 0.005ms children 29.777ms %children 99.98% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 29.688ms self 0.015ms children 29.673ms %children 99.95% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 29.673ms self 0.048ms children 29.625ms %children 99.84% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.097ms self 0.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.185ms self 0.185ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.480ms self 0.480ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.476ms self 0.476ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.133ms self 1.133ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.438ms self 1.438ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.181ms self 0.181ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 18.135ms self 18.135ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.477ms self 0.477ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.186ms self 1.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.117ms self 0.117ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.504ms self 1.504ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.391ms self 0.391ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.501ms self 0.501ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.129ms self 1.129ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.431ms self 1.431ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 6.056ms self 0.006ms children 6.050ms %children 99.90% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 5.886ms self 5.886ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.089ms self 0.089ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 37.528ms self 0.006ms children 37.522ms %children 99.98% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 36.870ms self 0.014ms children 36.856ms %children 99.96% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 36.856ms self 0.048ms children 36.808ms %children 99.87% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.132ms self 0.132ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.616 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.245ms self 0.245ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.564ms self 0.564ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.602ms self 0.602ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 1.925ms self 1.925ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.964ms self 1.964ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.292ms self 0.292ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.242ms self 0.242ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.528ms self 0.528ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.579ms self 0.579ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 17.366ms self 17.366ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.857ms self 1.857ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.617 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.287ms self 0.287ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: 
timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.242ms self 0.242ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.529ms self 0.529ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.578ms self 0.578ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.988ms self 3.988ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.710ms self 0.710ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.880ms self 3.880ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.623ms self 0.623ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 17.874ms self 17.874ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 27.271ms self 0.006ms children 27.265ms %children 99.98% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 27.165ms self 0.015ms children 27.150ms %children 99.95% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 27.150ms self 0.047ms children 27.102ms %children 99.83% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.346ms self 0.346ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.209ms self 0.209ms children 
0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.415ms self 0.415ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.514ms self 0.514ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.165ms self 1.165ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.533ms self 1.533ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.185ms self 0.185ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.387ms self 0.387ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.455ms self 0.455ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 16.146ms self 16.146ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.140ms self 0.140ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.450ms self 1.450ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.178ms self 0.178ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.463ms self 0.463ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.646 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.115ms self 1.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.385ms self 1.385ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.164ms self 0.004ms children 0.159ms %children 97.41% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 25.964ms self 0.005ms children 25.959ms %children 99.98% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 25.871ms self 0.015ms children 25.857ms %children 99.94% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 25.857ms self 0.046ms children 25.810ms %children 99.82% 
2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.181ms self 0.181ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.412ms self 0.412ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.460ms self 0.460ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.147ms self 1.147ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.112ms self 0.112ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.391ms self 1.391ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.178ms self 0.178ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.392ms self 0.392ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.436ms self 0.436ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.112ms self 1.112ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.110ms self 0.110ms children 0.000ms %children 0.00% 
2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 15.485ms self 15.485ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.385ms self 0.385ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.455ms self 0.455ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.395ms self 1.395ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.414ms self 1.414ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 2.803ms self 0.006ms children 2.797ms %children 99.79% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 2.664ms self 2.664ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.089ms self 0.089ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 36.745ms self 0.006ms children 36.739ms %children 99.98% 2023-04-22 21:13:37.647 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 36.628ms self 0.014ms children 36.614ms %children 99.96% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 36.614ms self 0.052ms children 36.562ms %children 99.86% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.255ms self 0.255ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.531ms self 0.531ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.548ms self 0.548ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.763ms self 4.763ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.582ms self 2.582ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.647 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.282ms self 0.282ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.245ms self 0.245ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.524ms self 0.524ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.724ms self 0.724ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.388ms self 3.388ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.144ms self 0.144ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.199ms self 2.199ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.295ms self 0.295ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.543ms self 0.543ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.536ms self 0.536ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.548ms self 0.548ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 14.950ms self 14.950ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.145ms self 0.145ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.077ms self 3.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.085ms self 0.085ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 4.538ms self 4.538ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 25.735ms self 0.007ms children 25.728ms %children 99.97% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 25.631ms self 0.016ms children 25.615ms %children 99.94% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 25.615ms self 0.048ms children 25.567ms %children 99.81% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.325ms self 0.325ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.205ms self 0.205ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.449ms self 0.449ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.462ms self 0.462ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.078ms self 1.078ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.419ms self 1.419ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.174ms self 0.174ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.399ms self 
0.399ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.441ms self 0.441ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.019ms self 1.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.372ms self 1.372ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.180ms self 0.180ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 14.444ms self 14.444ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.468ms self 0.468ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.065ms self 1.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.351ms self 1.351ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.152ms self 0.004ms children 0.148ms %children 97.28% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.046ms self 0.046ms children 0.000ms %children 
0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 24.210ms self 0.006ms children 24.204ms %children 99.97% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 24.119ms self 0.017ms children 24.102ms %children 99.93% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 24.102ms self 0.045ms children 24.057ms %children 99.81% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.648 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.191ms self 0.191ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.389ms self 0.389ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.440ms self 0.440ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.030ms self 1.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.136ms self 0.136ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 14.423ms self 14.423ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.183ms self 0.183ms children 
0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.401ms self 0.401ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.432ms self 0.432ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.033ms self 1.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.358ms self 1.358ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.169ms self 0.169ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.386ms self 0.386ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.440ms self 0.440ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.011ms self 1.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.112ms self 0.112ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.338ms self 1.338ms children 
0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 2.478ms self 0.006ms children 2.472ms %children 99.75% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 2.341ms self 2.341ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.084ms self 0.084ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 18.537ms self 0.005ms children 18.532ms %children 99.97% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 18.425ms self 0.015ms children 18.410ms %children 99.92% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 18.410ms self 0.047ms children 18.363ms %children 99.75% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.123ms self 0.123ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.241ms self 0.241ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.535ms self 0.535ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.549ms self 0.549ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 1.789ms self 1.789ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : 
INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.144ms self 0.144ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.712ms self 1.712ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.266ms self 0.266ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.243ms self 0.243ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.525ms self 0.525ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.549ms self 0.549ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.649ms self 4.649ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.147ms self 0.147ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.766ms self 1.766ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.263ms self 0.263ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.252ms self 0.252ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.521ms self 0.521ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.559ms self 0.559ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 1.728ms self 1.728ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.146ms self 0.146ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.656ms self 1.656ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 1.735ms self 1.735ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.649 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 3.992ms self 0.008ms children 3.984ms %children 99.79% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 3.955ms self 0.016ms children 3.940ms %children 99.60% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 3.940ms self 0.054ms children 3.885ms %children 98.63% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.204ms self 0.204ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.083ms self 0.083ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.165ms self 0.165ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.273ms self 0.273ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ForwardLets total 0.301ms self 0.301ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.461ms self 0.461ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.138ms self 0.138ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.190ms self 0.190ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.283ms self 0.283ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.398ms self 0.398ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.319ms self 0.319ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.400ms self 0.400ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.085ms self 0.005ms children 0.080ms %children 94.48% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 4.516ms self 0.006ms children 4.510ms %children 99.88% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 4.488ms self 0.011ms children 4.477ms %children 99.76% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 4.477ms self 0.040ms children 4.437ms %children 99.11% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 
0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.119ms self 1.119ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.331ms self 0.331ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.409ms self 0.409ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.274ms self 0.274ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.379ms self 0.379ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 
0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.650 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.277ms self 0.277ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.381ms self 0.381ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 0.629ms self 0.005ms children 0.624ms %children 99.14% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 0.574ms self 0.574ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 8.318ms self 0.005ms children 8.313ms %children 99.94% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 8.289ms self 0.011ms children 8.278ms %children 99.87% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 8.278ms self 0.044ms children 8.235ms %children 99.47% 2023-04-22 21:13:37.651 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.145ms self 0.145ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.185ms self 0.185ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.276ms self 0.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.992ms self 1.992ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.136ms self 0.136ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.193ms self 0.193ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.780ms self 0.780ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.452ms self 0.452ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.131ms self 0.131ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 1.795ms self 1.795ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 1.087ms self 1.087ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.422ms self 0.422ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 0.515ms self 0.515ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 4.854ms self 0.014ms children 4.840ms %children 99.70% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 4.817ms self 0.010ms children 4.807ms %children 99.79% 2023-04-22 21:13:37.651 : INFO: 
timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 4.807ms self 0.055ms children 4.752ms %children 98.85% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.131ms self 0.131ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.191ms self 0.191ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.768ms self 0.768ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.396ms self 0.396ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.182ms self 0.182ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.267ms self 0.267ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.651 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.369ms self 0.369ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.237ms self 0.237ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.465ms self 0.465ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.182ms self 0.182ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.276ms self 0.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.407ms self 0.407ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.277ms self 0.006ms children 0.271ms %children 97.97% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.232ms self 0.232ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 4.297ms self 0.005ms children 4.292ms %children 99.89% 2023-04-22 21:13:37.652 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 4.271ms self 0.142ms children 4.129ms %children 96.67% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 4.129ms self 0.039ms children 4.090ms %children 99.05% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.132ms self 0.132ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.269ms self 0.269ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.374ms self 0.374ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.897ms self 0.897ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.308ms self 0.308ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.380ms self 0.380ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.124ms self 0.124ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.266ms self 0.266ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.364ms self 0.364ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 13.709ms self 0.006ms children 13.703ms %children 99.96% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 
0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 13.653ms self 13.653ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 5.479ms self 0.006ms children 5.474ms %children 99.90% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 5.449ms self 0.009ms children 5.440ms %children 99.83% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 5.440ms self 0.037ms children 5.402ms %children 99.31% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.652 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.178ms self 0.178ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.282ms self 0.282ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.256ms self 1.256ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.129ms self 0.129ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.173ms self 0.173ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.275ms self 0.275ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.409ms self 0.409ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.126ms self 0.126ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.263ms self 0.263ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.543ms self 1.543ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 0.464ms self 0.464ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 3.523ms self 0.005ms children 3.518ms %children 99.85% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 3.495ms self 0.010ms children 3.485ms %children 99.71% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 3.485ms self 0.038ms children 3.448ms %children 98.92% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.129ms self 0.129ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.178ms self 0.178ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.373ms self 0.373ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/PruneDeadFields total 0.386ms self 0.386ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.260ms self 0.260ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.366ms self 0.366ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.126ms self 0.126ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.261ms self 0.261ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.367ms self 0.367ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.068ms self 0.004ms children 0.063ms %children 93.40% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 22.460ms self 0.006ms children 22.455ms %children 99.98% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.653 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 22.433ms self 0.010ms children 22.423ms %children 99.96% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 22.423ms self 0.040ms children 22.383ms %children 99.82% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.129ms self 0.129ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets 
total 0.258ms self 0.258ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.363ms self 0.363ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.124ms self 0.124ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.183ms self 0.183ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.250ms self 0.250ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.359ms self 0.359ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 19.151ms self 19.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify 
total 0.241ms self 0.241ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.388ms self 0.388ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 0.559ms self 0.005ms children 0.554ms %children 99.12% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 0.507ms self 0.507ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 3.271ms self 0.005ms children 3.265ms %children 99.84% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 3.245ms self 0.010ms children 3.235ms %children 99.71% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 3.235ms self 0.039ms children 3.196ms %children 98.80% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.190ms self 0.190ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.256ms self 0.256ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.368ms self 0.368ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.182ms self 0.182ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.257ms self 0.257ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.352ms self 0.352ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.248ms self 0.248ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.654 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.355ms self 0.355ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 0.462ms self 0.462ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 14.148ms self 0.006ms children 14.141ms %children 99.96% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 14.075ms self 0.016ms children 14.059ms %children 99.89% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 14.059ms self 0.051ms children 14.008ms %children 99.64% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.376ms self 0.376ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.180ms self 0.180ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.375ms self 0.375ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.613ms self 0.613ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.997ms self 0.997ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.655 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.215ms self 1.215ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.156ms self 0.156ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.336ms self 0.336ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.428ms self 0.428ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.954ms self 0.954ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.062ms self 1.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/FoldConstants total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.346ms self 0.346ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.424ms self 0.424ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.924ms self 0.924ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 3.933ms self 3.933ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.149ms self 1.149ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.157ms self 0.004ms children 0.152ms %children 97.16% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 55.305ms self 0.006ms children 55.300ms %children 99.99% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.673 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 55.244ms self 0.013ms children 55.231ms %children 99.98% 2023-04-22 21:13:37.673 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 55.231ms self 0.043ms children 55.188ms %children 99.92% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.161ms self 0.161ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.335ms self 0.335ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.431ms self 0.431ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.885ms self 0.885ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.053ms self 1.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.138ms self 0.138ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.156ms self 0.156ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.332ms self 0.332ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.872ms self 0.872ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.032ms self 1.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.135ms self 0.135ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.327ms self 0.327ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.389ms self 0.389ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 47.030ms self 47.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.055ms self 1.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 3.473ms self 0.005ms children 3.468ms %children 99.85% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 3.318ms self 3.318ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.107ms self 0.107ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 14.865ms self 0.005ms children 14.860ms %children 99.96% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 14.804ms self 0.013ms children 14.791ms %children 99.91% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 14.791ms self 5.909ms children 8.882ms %children 60.05% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.102ms self 0.102ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.335ms self 0.335ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.409ms self 0.409ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.843ms self 0.843ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.022ms self 1.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.159ms self 0.159ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: 
timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.331ms self 0.331ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.430ms self 0.430ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.827ms self 0.827ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.010ms self 1.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.166ms self 0.166ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.163ms self 0.163ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.331ms self 0.331ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.674 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.400ms self 0.400ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.845ms self 0.845ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.014ms self 1.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.039ms 
self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 1.276ms self 1.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 34.417ms self 0.006ms children 34.410ms %children 99.98% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 34.341ms self 0.025ms children 34.316ms %children 99.93% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 34.316ms self 0.046ms children 34.269ms %children 99.86% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.266ms self 0.266ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.337ms self 0.337ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 16.475ms self 16.475ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.907ms self 0.907ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.061ms self 1.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.153ms self 0.153ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.344ms self 0.344ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.845ms self 0.845ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.023ms self 1.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.132ms self 0.132ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.152ms self 0.152ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.335ms self 0.335ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.397ms self 0.397ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.838ms self 0.838ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 10.115ms self 10.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.152ms self 
0.004ms children 0.148ms %children 97.10% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 31.020ms self 0.005ms children 31.015ms %children 99.98% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 30.959ms self 0.013ms children 30.946ms %children 99.96% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 30.946ms self 0.054ms children 30.892ms %children 99.83% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.097ms self 0.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.160ms self 0.160ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.329ms self 0.329ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.439ms self 0.439ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.844ms self 0.844ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.025ms self 1.025ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.675 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.386ms self 0.386ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.823ms self 0.823ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.007ms self 1.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 22.161ms self 22.161ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.185ms self 0.185ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.338ms self 0.338ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.843ms self 0.843ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 
21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.053ms self 1.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 1.430ms self 0.005ms children 1.425ms %children 99.66% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 1.318ms self 1.318ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 10.952ms self 0.005ms children 10.947ms %children 99.95% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 10.889ms self 0.013ms children 10.876ms %children 99.88% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 10.876ms self 0.043ms children 10.833ms %children 99.61% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.166ms self 0.166ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.329ms self 0.329ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 2.893ms self 2.893ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.042ms self 1.042ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.123ms self 0.123ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.338ms self 0.338ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.385ms self 0.385ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.788ms self 0.788ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.002ms self 1.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.150ms self 0.150ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.338ms self 0.338ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.391ms self 0.391ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.813ms self 0.813ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.097ms self 1.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 1.265ms self 1.265ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR total 59.155ms self 0.006ms children 59.149ms %children 99.99% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation total 59.080ms self 0.014ms children 59.066ms %children 99.98% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 59.066ms self 0.055ms children 59.012ms %children 99.91% 2023-04-22 21:13:37.676 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.273ms self 0.273ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 46.003ms self 46.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.459ms self 0.459ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.459ms self 0.459ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.822ms self 0.822ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.223ms self 1.223ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 2.048ms self 2.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.355ms self 0.355ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.414ms self 0.414ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.773ms self 0.773ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.522ms self 1.522ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/NormalizeNames total 0.336ms self 0.336ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.916ms self 0.916ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.791ms self 0.791ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 1.007ms self 1.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.061ms self 1.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, initial IR/Verify total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR total 0.539ms self 0.007ms children 0.532ms %children 98.68% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/LoweringTransformation total 0.446ms self 0.446ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InlineApplyIR/Verify total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR total 10.167ms self 0.006ms children 10.161ms %children 99.94% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 10.107ms self 0.014ms children 10.093ms %children 99.87% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 10.093ms self 0.046ms children 10.047ms %children 99.54% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.338ms self 0.338ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.443ms self 0.443ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.775ms self 0.775ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.025ms self 1.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.124ms self 0.124ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.149ms self 0.149ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.397ms self 0.397ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.748ms self 0.748ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.386ms self 1.386ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.335ms self 0.335ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.160ms self 0.160ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.342ms self 0.342ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.392ms self 0.392ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 1.570ms self 1.570ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.677 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.079ms self 0.079ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.067ms self 1.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after InlineApplyIR/Verify total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs total 10.297ms self 0.009ms children 10.288ms %children 99.91% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/LoweringTransformation total 10.201ms self 10.201ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerArrayAggsToRunAggs/Verify total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 39.249ms self 0.009ms children 39.239ms %children 99.98% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 39.186ms self 0.014ms children 39.172ms %children 99.97% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 39.172ms self 0.048ms children 39.124ms %children 99.88% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.096ms self 0.096ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.157ms self 0.157ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.342ms self 0.342ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.392ms self 0.392ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.767ms self 0.767ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 31.656ms self 31.656ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.126ms self 0.126ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.330ms self 0.330ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 
0.402ms self 0.402ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.757ms self 0.757ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 1.015ms self 1.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.122ms self 0.122ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.325ms self 0.325ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.396ms self 0.396ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.749ms self 0.749ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.994ms self 0.994ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EmitContext.analyze total 1.276ms self 1.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.503ms self 0.009ms children 0.493ms %children 98.18% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.485ms self 0.018ms children 0.467ms %children 96.32% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.467ms self 0.037ms children 0.430ms %children 92.14% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.109ms self 0.109ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/Compile total 27.736ms self 25.643ms children 2.093ms %children 7.55% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.262ms self 0.006ms children 0.256ms %children 97.53% 2023-04-22 21:13:37.678 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.250ms self 0.012ms children 0.238ms %children 95.25% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.238ms self 0.016ms children 0.222ms %children 93.38% 
2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.020ms self 0.003ms children 0.017ms %children 82.98% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.187ms self 0.003ms children 0.184ms %children 98.16% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.180ms self 0.007ms children 0.173ms %children 95.91% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.173ms self 0.013ms children 0.160ms %children 92.76% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.010ms self 0.010ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.079ms self 0.003ms children 0.075ms %children 95.68% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.177ms self 0.003ms children 0.174ms %children 98.09% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.171ms self 0.007ms children 0.164ms %children 95.85% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.164ms self 0.010ms children 0.153ms %children 93.64% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.679 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.196ms self 0.005ms children 0.191ms %children 97.69% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.186ms self 0.008ms children 0.178ms %children 95.90% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.178ms self 0.011ms children 0.167ms %children 93.68% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.035ms self 
0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.027ms self 0.013ms children 0.014ms %children 51.41% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.169ms self 0.003ms children 0.165ms %children 98.01% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.162ms self 0.007ms children 0.155ms %children 95.69% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.155ms self 0.011ms children 0.144ms %children 92.70% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.059ms self 0.003ms children 
0.056ms %children 94.52% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.162ms self 0.003ms children 0.159ms %children 98.13% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.155ms self 0.007ms children 0.149ms %children 95.62% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.149ms self 0.010ms children 0.139ms %children 93.30% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.680 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.188ms self 0.004ms children 0.183ms %children 97.74% 
2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.178ms self 0.007ms children 0.171ms %children 95.85% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.171ms self 0.011ms children 0.160ms %children 93.39% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.016ms self 0.003ms children 0.013ms %children 80.02% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.195ms self 0.005ms children 0.190ms %children 97.53% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.184ms self 0.008ms children 0.176ms %children 95.50% 2023-04-22 21:13:37.681 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.176ms self 0.012ms children 0.164ms %children 93.23% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.062ms self 0.003ms children 0.059ms %children 94.78% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.168ms self 0.003ms children 0.165ms %children 98.04% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.161ms self 0.007ms children 0.154ms %children 95.57% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.154ms self 0.010ms children 0.144ms %children 93.29% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, 
after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.255ms self 1.255ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.681 : INFO: timing SparkBackend.executeEncode/RunCompiledVoidFunction total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.694 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:37.864 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:37.864 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:37.864 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:37.864 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:37.864 : INFO: timing SparkBackend.parse_value_ir total 170.272ms self 170.272ms children 0.000ms %children 0.00% 2023-04-22 21:13:37.867 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:37.867 : INFO: starting execution of query hail_query_4 of initial size 6 2023-04-22 21:13:37.869 : INFO: initial IR: IR size 6: (Let __rng_state (RNGStateLiteral) (MakeTuple (0) (GetField eigenvalues (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],loadings:Array[Float64]}} False (TableNativeReader 
/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 )))))) 2023-04-22 21:13:37.871 : INFO: after optimize: relationalLowerer, initial IR: IR size 4: (MakeTuple (0) (GetField eigenvalues (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) 2023-04-22 21:13:37.872 : INFO: after LowerMatrixToTable: IR size 4: (MakeTuple (0) (GetField eigenvalues (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) 2023-04-22 21:13:37.873 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 4: (MakeTuple (0) (GetField eigenvalues (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) 2023-04-22 21:13:37.873 : INFO: after LiftRelationalValuesToRelationalLets: IR size 6: (RelationalLet __iruid_991 (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) (MakeTuple (0) (GetField eigenvalues (RelationalRef __iruid_991 Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:37.874 : INFO: initial IR: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:37.874 : INFO: after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:37.874 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:37.875 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
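The timing entries earlier in this stretch report, for every lowering pass, a total, a self time, a children time, and a %children figure. From the logged numbers, %children appears to simply be children/total; small discrepancies (for example 0.190/0.195 = 97.44% against the logged 97.53%) are consistent with the millisecond values being rounded to three decimals before printing. A reader-side sanity check on one logged entry, not Hail code:

    # "compileLowerer, after InlineApplyIR": total 0.195 ms, self 0.005 ms, children 0.190 ms
    total_ms, self_ms, children_ms = 0.195, 0.005, 0.190

    # self + children reproduces the total, and %children = children / total (up to rounding).
    print(self_ms + children_ms)            # 0.195
    print(100.0 * children_ms / total_ms)   # ~97.4, logged as 97.53% from the unrounded values
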
2023-04-22 21:13:37.875 : INFO: lowering result: TableGetGlobals 2023-04-22 21:13:37.913 : INFO: compiling and evaluating result: TableGetGlobals 2023-04-22 21:13:37.934 : INFO: initial IR: IR size 9: (Let __iruid_992 (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (MakeStruct (partitionIndex (I64 0)) (partitionPath (Str "/fg/saxena..."))))) (I32 0)) (Ref __iruid_992)) 2023-04-22 21:13:37.937 : INFO: after optimize: relationalLowerer, initial IR: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.938 : INFO: after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.940 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.941 : INFO: after LiftRelationalValuesToRelationalLets: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.942 : INFO: after EvalRelationalLets: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.965 : INFO: after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.967 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.968 : INFO: after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.969 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:37.971 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:37.973 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.004 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.007 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.008 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.010 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.051 : INFO: encoder cache miss (10 hits, 11 misses, 0.476) 2023-04-22 21:13:38.052 : INFO: instruction count: 3: __C1023HailClassLoaderContainer. 2023-04-22 21:13:38.052 : INFO: instruction count: 3: __C1023HailClassLoaderContainer. 2023-04-22 21:13:38.053 : INFO: instruction count: 3: __C1025FSContainer. 2023-04-22 21:13:38.053 : INFO: instruction count: 3: __C1025FSContainer. 
2023-04-22 21:13:38.053 : INFO: instruction count: 3: __C1027etypeEncode. 2023-04-22 21:13:38.053 : INFO: instruction count: 7: __C1027etypeEncode.apply 2023-04-22 21:13:38.053 : INFO: instruction count: 9: __C1027etypeEncode.__m1029ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND 2023-04-22 21:13:38.053 : INFO: instruction count: 25: __C1027etypeEncode.__m1030ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64ANDr_binaryEND 2023-04-22 21:13:38.053 : INFO: instruction count: 4: __C1027etypeEncode.__m1031ENCODE_SInt64$_TO_r_int64 2023-04-22 21:13:38.054 : INFO: instruction count: 16: __C1027etypeEncode.__m1032ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:38.055 MemoryStore: INFO: Block broadcast_146 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:38.084 MemoryStore: INFO: Block broadcast_146_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:38.084 BlockManagerInfo: INFO: Added broadcast_146_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.085 SparkContext: INFO: Created broadcast 146 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.085 : INFO: instruction count: 3: __C985HailClassLoaderContainer. 2023-04-22 21:13:38.085 : INFO: instruction count: 3: __C985HailClassLoaderContainer. 2023-04-22 21:13:38.085 : INFO: instruction count: 3: __C987FSContainer. 2023-04-22 21:13:38.085 : INFO: instruction count: 3: __C987FSContainer. 2023-04-22 21:13:38.089 : INFO: instruction count: 3: __C989Compiled. 2023-04-22 21:13:38.089 : INFO: instruction count: 45: __C989Compiled.apply 2023-04-22 21:13:38.089 : INFO: instruction count: 345: __C989Compiled.__m991split_ToArray 2023-04-22 21:13:38.090 : INFO: instruction count: 31: __C989Compiled.__m999DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.090 : INFO: instruction count: 58: __C989Compiled.__m1000INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.090 : INFO: instruction count: 10: __C989Compiled.__m1001INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.090 : INFO: instruction count: 22: __C989Compiled.__m1002SKIP_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.090 : INFO: instruction count: 16: __C989Compiled.__m1003SKIP_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.090 : INFO: instruction count: 7: __C989Compiled.__m1004SKIP_r_binary 2023-04-22 21:13:38.090 : INFO: instruction count: 22: __C989Compiled.__m1005SKIP_r_array_of_r_float64 2023-04-22 21:13:38.090 : INFO: instruction count: 3: __C989Compiled.__m1006SKIP_r_float64 2023-04-22 21:13:38.090 : INFO: instruction count: 12: __C989Compiled.__m1009setup_jab 2023-04-22 21:13:38.090 : INFO: instruction count: 35: __C989Compiled.__m1014arrayref_bounds_check 2023-04-22 21:13:38.090 : INFO: instruction count: 9: __C989Compiled.setPartitionIndex 2023-04-22 21:13:38.090 : INFO: instruction count: 4: __C989Compiled.addPartitionRegion 2023-04-22 21:13:38.090 : INFO: instruction count: 4: __C989Compiled.setPool 2023-04-22 21:13:38.090 : INFO: instruction count: 3: __C989Compiled.addHailClassLoader 2023-04-22 21:13:38.090 : INFO: instruction count: 3: __C989Compiled.addFS 2023-04-22 21:13:38.090 : INFO: instruction count: 4: __C989Compiled.addTaskContext 2023-04-22 21:13:38.090 : INFO: instruction count: 41: __C989Compiled.addAndDecodeLiterals 2023-04-22 
21:13:38.090 : INFO: instruction count: 27: __C989Compiled.__m1019DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.091 : INFO: instruction count: 26: __C989Compiled.__m1020INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:38.091 : INFO: instruction count: 10: __C989Compiled.__m1021INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:38.091 : INFO: instruction count: 31: __C989Compiled.__m1022INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:38.092 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.094 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.095 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.097 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.100 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.102 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.128 : INFO: encoder cache hit 2023-04-22 21:13:38.129 MemoryStore: INFO: Block broadcast_147 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:38.130 MemoryStore: INFO: Block broadcast_147_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:38.131 BlockManagerInfo: INFO: Added broadcast_147_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.145 SparkContext: INFO: Created broadcast 147 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.145 : INFO: instruction count: 3: __C1033HailClassLoaderContainer. 2023-04-22 21:13:38.145 : INFO: instruction count: 3: __C1033HailClassLoaderContainer. 2023-04-22 21:13:38.145 : INFO: instruction count: 3: __C1035FSContainer. 2023-04-22 21:13:38.145 : INFO: instruction count: 3: __C1035FSContainer. 2023-04-22 21:13:38.149 : INFO: instruction count: 3: __C1037Compiled. 
2023-04-22 21:13:38.149 : INFO: instruction count: 45: __C1037Compiled.apply 2023-04-22 21:13:38.160 : INFO: instruction count: 345: __C1037Compiled.__m1039split_ToArray 2023-04-22 21:13:38.160 : INFO: instruction count: 31: __C1037Compiled.__m1047DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.160 : INFO: instruction count: 58: __C1037Compiled.__m1048INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.161 : INFO: instruction count: 10: __C1037Compiled.__m1049INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.161 : INFO: instruction count: 22: __C1037Compiled.__m1050SKIP_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.161 : INFO: instruction count: 16: __C1037Compiled.__m1051SKIP_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.161 : INFO: instruction count: 7: __C1037Compiled.__m1052SKIP_r_binary 2023-04-22 21:13:38.161 : INFO: instruction count: 22: __C1037Compiled.__m1053SKIP_r_array_of_r_float64 2023-04-22 21:13:38.161 : INFO: instruction count: 3: __C1037Compiled.__m1054SKIP_r_float64 2023-04-22 21:13:38.161 : INFO: instruction count: 12: __C1037Compiled.__m1057setup_jab 2023-04-22 21:13:38.161 : INFO: instruction count: 35: __C1037Compiled.__m1062arrayref_bounds_check 2023-04-22 21:13:38.161 : INFO: instruction count: 9: __C1037Compiled.setPartitionIndex 2023-04-22 21:13:38.161 : INFO: instruction count: 4: __C1037Compiled.addPartitionRegion 2023-04-22 21:13:38.161 : INFO: instruction count: 4: __C1037Compiled.setPool 2023-04-22 21:13:38.161 : INFO: instruction count: 3: __C1037Compiled.addHailClassLoader 2023-04-22 21:13:38.161 : INFO: instruction count: 3: __C1037Compiled.addFS 2023-04-22 21:13:38.161 : INFO: instruction count: 4: __C1037Compiled.addTaskContext 2023-04-22 21:13:38.161 : INFO: instruction count: 41: __C1037Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.161 : INFO: instruction count: 27: __C1037Compiled.__m1067DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.161 : INFO: instruction count: 26: __C1037Compiled.__m1068INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:38.161 : INFO: instruction count: 10: __C1037Compiled.__m1069INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:38.161 : INFO: instruction count: 31: __C1037Compiled.__m1070INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:38.162 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.164 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.165 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.167 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.168 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.177 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{eigenvalues:Array[Float64]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:38.184 : INFO: encoder cache hit 2023-04-22 21:13:38.184 MemoryStore: INFO: Block broadcast_148 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:38.199 MemoryStore: INFO: Block broadcast_148_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:38.199 BlockManagerInfo: INFO: Added broadcast_148_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.199 SparkContext: INFO: Created broadcast 148 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.199 : INFO: instruction count: 3: __C1071HailClassLoaderContainer. 2023-04-22 21:13:38.199 : INFO: instruction count: 3: __C1071HailClassLoaderContainer. 2023-04-22 21:13:38.210 : INFO: instruction count: 3: __C1073FSContainer. 2023-04-22 21:13:38.210 : INFO: instruction count: 3: __C1073FSContainer. 2023-04-22 21:13:38.213 : INFO: instruction count: 3: __C1075Compiled. 2023-04-22 21:13:38.213 : INFO: instruction count: 45: __C1075Compiled.apply 2023-04-22 21:13:38.214 : INFO: instruction count: 345: __C1075Compiled.__m1077split_ToArray 2023-04-22 21:13:38.214 : INFO: instruction count: 31: __C1075Compiled.__m1085DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.214 : INFO: instruction count: 58: __C1075Compiled.__m1086INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.214 : INFO: instruction count: 10: __C1075Compiled.__m1087INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.214 : INFO: instruction count: 22: __C1075Compiled.__m1088SKIP_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.214 : INFO: instruction count: 16: __C1075Compiled.__m1089SKIP_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:38.214 : INFO: instruction count: 7: __C1075Compiled.__m1090SKIP_r_binary 2023-04-22 21:13:38.214 : INFO: instruction count: 22: __C1075Compiled.__m1091SKIP_r_array_of_r_float64 2023-04-22 21:13:38.214 : INFO: instruction count: 3: __C1075Compiled.__m1092SKIP_r_float64 2023-04-22 21:13:38.214 : INFO: instruction count: 12: __C1075Compiled.__m1095setup_jab 2023-04-22 21:13:38.215 : INFO: instruction count: 35: __C1075Compiled.__m1100arrayref_bounds_check 2023-04-22 21:13:38.215 : INFO: instruction count: 9: __C1075Compiled.setPartitionIndex 2023-04-22 21:13:38.215 : INFO: instruction count: 4: __C1075Compiled.addPartitionRegion 2023-04-22 21:13:38.215 : INFO: instruction count: 4: __C1075Compiled.setPool 2023-04-22 21:13:38.215 : INFO: instruction count: 3: __C1075Compiled.addHailClassLoader 2023-04-22 21:13:38.215 : INFO: instruction count: 3: __C1075Compiled.addFS 2023-04-22 21:13:38.215 : INFO: instruction count: 4: __C1075Compiled.addTaskContext 2023-04-22 21:13:38.215 : INFO: instruction count: 41: 
__C1075Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.215 : INFO: instruction count: 27: __C1075Compiled.__m1105DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:38.215 : INFO: instruction count: 26: __C1075Compiled.__m1106INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:38.215 : INFO: instruction count: 10: __C1075Compiled.__m1107INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:38.215 : INFO: instruction count: 31: __C1075Compiled.__m1108INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:38.272 : INFO: encoder cache miss (12 hits, 12 misses, 0.500) 2023-04-22 21:13:38.273 : INFO: instruction count: 3: __C1109HailClassLoaderContainer. 2023-04-22 21:13:38.273 : INFO: instruction count: 3: __C1109HailClassLoaderContainer. 2023-04-22 21:13:38.273 : INFO: instruction count: 3: __C1111FSContainer. 2023-04-22 21:13:38.274 : INFO: instruction count: 3: __C1111FSContainer. 2023-04-22 21:13:38.274 : INFO: instruction count: 3: __C1113etypeEncode. 2023-04-22 21:13:38.274 : INFO: instruction count: 7: __C1113etypeEncode.apply 2023-04-22 21:13:38.274 : INFO: instruction count: 25: __C1113etypeEncode.__m1115ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_float64END 2023-04-22 21:13:38.274 : INFO: instruction count: 39: __C1113etypeEncode.__m1116ENCODE_SIndexablePointer_TO_r_array_of_r_float64 2023-04-22 21:13:38.274 : INFO: instruction count: 4: __C1113etypeEncode.__m1117ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:38.277 : INFO: took 402.313ms 2023-04-22 21:13:38.277 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: initial IR: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after EvalRelationalLets: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.278 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.279 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.280 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{eigenvalues:Array[Float64]}) 2023-04-22 21:13:38.281 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.281 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.282 : INFO: 
after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.282 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.283 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.283 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.286 : INFO: encoder cache miss (12 hits, 13 misses, 0.480) 2023-04-22 21:13:38.287 : INFO: instruction count: 3: __C1132HailClassLoaderContainer. 2023-04-22 21:13:38.287 : INFO: instruction count: 3: __C1132HailClassLoaderContainer. 2023-04-22 21:13:38.287 : INFO: instruction count: 3: __C1134FSContainer. 2023-04-22 21:13:38.287 : INFO: instruction count: 3: __C1134FSContainer. 2023-04-22 21:13:38.287 : INFO: instruction count: 3: __C1136etypeEncode. 2023-04-22 21:13:38.287 : INFO: instruction count: 7: __C1136etypeEncode.apply 2023-04-22 21:13:38.287 : INFO: instruction count: 1: __C1136etypeEncode.__m1138ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:13:38.289 MemoryStore: INFO: Block broadcast_149 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.292 MemoryStore: INFO: Block broadcast_149_piece0 stored as bytes in memory (estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.292 BlockManagerInfo: INFO: Added broadcast_149_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.293 SparkContext: INFO: Created broadcast 149 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.293 : INFO: instruction count: 3: __C1118HailClassLoaderContainer. 2023-04-22 21:13:38.293 : INFO: instruction count: 3: __C1118HailClassLoaderContainer. 2023-04-22 21:13:38.293 : INFO: instruction count: 3: __C1120FSContainer. 2023-04-22 21:13:38.293 : INFO: instruction count: 3: __C1120FSContainer. 2023-04-22 21:13:38.294 : INFO: instruction count: 3: __C1122Compiled. 
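The encoder cache entries scattered through this stretch ("encoder cache miss (10 hits, 11 misses, 0.476)", "(12 hits, 12 misses, 0.500)", "(12 hits, 13 misses, 0.480)", and, further on, "(18 hits, 14 misses, 0.563)") carry a running hit rate as their third number: it matches hits / (hits + misses), rounded to three decimals. A trivial reader-side check:

    # Hit rate reported in the "encoder cache" log entries: hits / (hits + misses).
    for hits, misses, logged in [(10, 11, 0.476), (12, 12, 0.500), (12, 13, 0.480), (18, 14, 0.563)]:
        rate = hits / (hits + misses)
        print(f"{hits}/{hits + misses} = {rate:.4f}  (logged {logged})")
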
2023-04-22 21:13:38.294 : INFO: instruction count: 27: __C1122Compiled.apply 2023-04-22 21:13:38.294 : INFO: instruction count: 9: __C1122Compiled.setPartitionIndex 2023-04-22 21:13:38.294 : INFO: instruction count: 4: __C1122Compiled.addPartitionRegion 2023-04-22 21:13:38.294 : INFO: instruction count: 4: __C1122Compiled.setPool 2023-04-22 21:13:38.294 : INFO: instruction count: 3: __C1122Compiled.addHailClassLoader 2023-04-22 21:13:38.294 : INFO: instruction count: 3: __C1122Compiled.addFS 2023-04-22 21:13:38.294 : INFO: instruction count: 4: __C1122Compiled.addTaskContext 2023-04-22 21:13:38.294 : INFO: instruction count: 64: __C1122Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.295 : INFO: instruction count: 18: __C1122Compiled.__m1128DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.295 : INFO: instruction count: 27: __C1122Compiled.__m1129DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.295 : INFO: instruction count: 58: __C1122Compiled.__m1130INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.295 : INFO: instruction count: 10: __C1122Compiled.__m1131INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.295 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.295 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.296 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.296 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.297 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.297 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.301 : INFO: encoder cache hit 2023-04-22 21:13:38.302 MemoryStore: INFO: Block broadcast_150 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.303 MemoryStore: INFO: Block broadcast_150_piece0 stored as bytes in memory (estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.303 BlockManagerInfo: INFO: Added broadcast_150_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.303 SparkContext: INFO: Created broadcast 150 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.304 : INFO: instruction count: 3: __C1139HailClassLoaderContainer. 2023-04-22 21:13:38.304 : INFO: instruction count: 3: __C1139HailClassLoaderContainer. 2023-04-22 21:13:38.304 : INFO: instruction count: 3: __C1141FSContainer. 2023-04-22 21:13:38.304 : INFO: instruction count: 3: __C1141FSContainer. 2023-04-22 21:13:38.305 : INFO: instruction count: 3: __C1143Compiled. 
2023-04-22 21:13:38.305 : INFO: instruction count: 27: __C1143Compiled.apply 2023-04-22 21:13:38.305 : INFO: instruction count: 9: __C1143Compiled.setPartitionIndex 2023-04-22 21:13:38.305 : INFO: instruction count: 4: __C1143Compiled.addPartitionRegion 2023-04-22 21:13:38.305 : INFO: instruction count: 4: __C1143Compiled.setPool 2023-04-22 21:13:38.305 : INFO: instruction count: 3: __C1143Compiled.addHailClassLoader 2023-04-22 21:13:38.305 : INFO: instruction count: 3: __C1143Compiled.addFS 2023-04-22 21:13:38.305 : INFO: instruction count: 4: __C1143Compiled.addTaskContext 2023-04-22 21:13:38.305 : INFO: instruction count: 64: __C1143Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.305 : INFO: instruction count: 18: __C1143Compiled.__m1149DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.305 : INFO: instruction count: 27: __C1143Compiled.__m1150DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.305 : INFO: instruction count: 58: __C1143Compiled.__m1151INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.305 : INFO: instruction count: 10: __C1143Compiled.__m1152INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.306 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.306 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.307 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.307 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.307 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.310 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{eigenvalues:Array[Float64]})) 2023-04-22 21:13:38.324 : INFO: encoder cache hit 2023-04-22 21:13:38.324 MemoryStore: INFO: Block broadcast_151 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.326 MemoryStore: INFO: Block broadcast_151_piece0 stored as bytes in memory (estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.326 BlockManagerInfo: INFO: Added broadcast_151_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.326 SparkContext: INFO: Created broadcast 151 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.326 : INFO: instruction count: 3: __C1153HailClassLoaderContainer. 2023-04-22 21:13:38.327 : INFO: instruction count: 3: __C1153HailClassLoaderContainer. 2023-04-22 21:13:38.327 : INFO: instruction count: 3: __C1155FSContainer. 2023-04-22 21:13:38.327 : INFO: instruction count: 3: __C1155FSContainer. 2023-04-22 21:13:38.328 : INFO: instruction count: 3: __C1157Compiled. 
2023-04-22 21:13:38.328 : INFO: instruction count: 27: __C1157Compiled.apply 2023-04-22 21:13:38.328 : INFO: instruction count: 9: __C1157Compiled.setPartitionIndex 2023-04-22 21:13:38.328 : INFO: instruction count: 4: __C1157Compiled.addPartitionRegion 2023-04-22 21:13:38.328 : INFO: instruction count: 4: __C1157Compiled.setPool 2023-04-22 21:13:38.328 : INFO: instruction count: 3: __C1157Compiled.addHailClassLoader 2023-04-22 21:13:38.328 : INFO: instruction count: 3: __C1157Compiled.addFS 2023-04-22 21:13:38.328 : INFO: instruction count: 4: __C1157Compiled.addTaskContext 2023-04-22 21:13:38.328 : INFO: instruction count: 64: __C1157Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.328 : INFO: instruction count: 18: __C1157Compiled.__m1163DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.328 : INFO: instruction count: 27: __C1157Compiled.__m1164DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.328 : INFO: instruction count: 58: __C1157Compiled.__m1165INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.328 : INFO: instruction count: 10: __C1157Compiled.__m1166INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.330 : INFO: encoder cache hit 2023-04-22 21:13:38.331 : INFO: after EvalRelationalLets: IR size 3: (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]}))) 2023-04-22 21:13:38.331 : INFO: after LowerAndExecuteShuffles: IR size 3: (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]}))) 2023-04-22 21:13:38.332 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 3: (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]}))) 2023-04-22 21:13:38.333 : INFO: after LowerOrInterpretNonCompilable: IR size 3: (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]}))) 2023-04-22 21:13:38.334 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 3: (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]}))) 2023-04-22 21:13:38.335 : INFO: initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.335 : INFO: after optimize: compileLowerer, initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.371 : INFO: after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.372 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.372 : INFO: after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.373 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.376 : INFO: encoder cache hit 2023-04-22 21:13:38.377 MemoryStore: INFO: Block broadcast_152 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.378 MemoryStore: INFO: Block broadcast_152_piece0 stored as bytes in memory 
(estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.380 BlockManagerInfo: INFO: Added broadcast_152_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.381 SparkContext: INFO: Created broadcast 152 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.381 : INFO: instruction count: 3: __C1167HailClassLoaderContainer. 2023-04-22 21:13:38.381 : INFO: instruction count: 3: __C1167HailClassLoaderContainer. 2023-04-22 21:13:38.381 : INFO: instruction count: 3: __C1169FSContainer. 2023-04-22 21:13:38.381 : INFO: instruction count: 3: __C1169FSContainer. 2023-04-22 21:13:38.382 : INFO: instruction count: 3: __C1171Compiled. 2023-04-22 21:13:38.382 : INFO: instruction count: 56: __C1171Compiled.apply 2023-04-22 21:13:38.382 : INFO: instruction count: 9: __C1171Compiled.setPartitionIndex 2023-04-22 21:13:38.382 : INFO: instruction count: 4: __C1171Compiled.addPartitionRegion 2023-04-22 21:13:38.382 : INFO: instruction count: 4: __C1171Compiled.setPool 2023-04-22 21:13:38.382 : INFO: instruction count: 3: __C1171Compiled.addHailClassLoader 2023-04-22 21:13:38.382 : INFO: instruction count: 3: __C1171Compiled.addFS 2023-04-22 21:13:38.382 : INFO: instruction count: 4: __C1171Compiled.addTaskContext 2023-04-22 21:13:38.383 : INFO: instruction count: 64: __C1171Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.383 : INFO: instruction count: 18: __C1171Compiled.__m1177DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.383 : INFO: instruction count: 27: __C1171Compiled.__m1178DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.383 : INFO: instruction count: 58: __C1171Compiled.__m1179INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.383 : INFO: instruction count: 10: __C1171Compiled.__m1180INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.383 : INFO: initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.384 : INFO: after optimize: compileLowerer, initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.384 : INFO: after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.385 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.386 : INFO: after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.386 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.391 : INFO: encoder cache hit 2023-04-22 21:13:38.391 MemoryStore: INFO: Block broadcast_153 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.392 MemoryStore: INFO: Block broadcast_153_piece0 stored as bytes in memory (estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.392 BlockManagerInfo: INFO: Added broadcast_153_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.409 SparkContext: INFO: Created broadcast 153 from 
broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.409 : INFO: instruction count: 3: __C1181HailClassLoaderContainer. 2023-04-22 21:13:38.409 : INFO: instruction count: 3: __C1181HailClassLoaderContainer. 2023-04-22 21:13:38.409 : INFO: instruction count: 3: __C1183FSContainer. 2023-04-22 21:13:38.409 : INFO: instruction count: 3: __C1183FSContainer. 2023-04-22 21:13:38.411 : INFO: instruction count: 3: __C1185Compiled. 2023-04-22 21:13:38.411 : INFO: instruction count: 56: __C1185Compiled.apply 2023-04-22 21:13:38.411 : INFO: instruction count: 9: __C1185Compiled.setPartitionIndex 2023-04-22 21:13:38.411 : INFO: instruction count: 4: __C1185Compiled.addPartitionRegion 2023-04-22 21:13:38.411 : INFO: instruction count: 4: __C1185Compiled.setPool 2023-04-22 21:13:38.411 : INFO: instruction count: 3: __C1185Compiled.addHailClassLoader 2023-04-22 21:13:38.411 : INFO: instruction count: 3: __C1185Compiled.addFS 2023-04-22 21:13:38.411 : INFO: instruction count: 4: __C1185Compiled.addTaskContext 2023-04-22 21:13:38.411 : INFO: instruction count: 64: __C1185Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.411 : INFO: instruction count: 18: __C1185Compiled.__m1191DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.411 : INFO: instruction count: 27: __C1185Compiled.__m1192DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.411 : INFO: instruction count: 58: __C1185Compiled.__m1193INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.411 : INFO: instruction count: 10: __C1185Compiled.__m1194INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.412 : INFO: initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.412 : INFO: after optimize: compileLowerer, initial IR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.413 : INFO: after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.413 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.414 : INFO: after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.415 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 4: (MakeTuple (0) (MakeTuple (0) (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})))) 2023-04-22 21:13:38.418 : INFO: encoder cache hit 2023-04-22 21:13:38.418 MemoryStore: INFO: Block broadcast_154 stored as values in memory (estimated size 248.0 B, free 25.1 GiB) 2023-04-22 21:13:38.419 MemoryStore: INFO: Block broadcast_154_piece0 stored as bytes in memory (estimated size 152.0 B, free 25.1 GiB) 2023-04-22 21:13:38.421 BlockManagerInfo: INFO: Added broadcast_154_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:13:38.422 SparkContext: INFO: Created broadcast 154 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:38.422 : INFO: instruction count: 3: __C1195HailClassLoaderContainer. 2023-04-22 21:13:38.422 : INFO: instruction count: 3: __C1195HailClassLoaderContainer. 
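The IR being compiled repeatedly here simply unwraps a literal struct field, (GetField eigenvalues (EncodedLiteral Struct{eigenvalues:Array[Float64]})), which is the shape of result a PCA-style Hail query returns. As a hedged sketch only (the actual user code is not in this log; the table path, entry field, and k below are placeholders, and hl.hwe_normalized_pca is one plausible call, hl.pca being another), code along these lines would hand the backend an eigenvalues: Array[Float64] value to encode:

    # Hedged sketch, not taken from this log: a Hail query whose result carries an
    # eigenvalues: Array[Float64] field, matching the EncodedLiteral struct above.
    import hail as hl

    hl.init()

    # Placeholder dataset path; the real input is not recorded in this log section.
    mt = hl.read_matrix_table('data/dataset.mt')

    # hl.hwe_normalized_pca returns (eigenvalues, scores, loadings); eigenvalues is
    # a list of floats, i.e. Array[Float64] on the Hail side.
    eigenvalues, scores, _ = hl.hwe_normalized_pca(mt.GT, k=10, compute_loadings=False)
    print(eigenvalues)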
2023-04-22 21:13:38.422 : INFO: instruction count: 3: __C1197FSContainer. 2023-04-22 21:13:38.422 : INFO: instruction count: 3: __C1197FSContainer. 2023-04-22 21:13:38.423 : INFO: instruction count: 3: __C1199Compiled. 2023-04-22 21:13:38.423 : INFO: instruction count: 56: __C1199Compiled.apply 2023-04-22 21:13:38.424 : INFO: instruction count: 9: __C1199Compiled.setPartitionIndex 2023-04-22 21:13:38.424 : INFO: instruction count: 4: __C1199Compiled.addPartitionRegion 2023-04-22 21:13:38.424 : INFO: instruction count: 4: __C1199Compiled.setPool 2023-04-22 21:13:38.424 : INFO: instruction count: 3: __C1199Compiled.addHailClassLoader 2023-04-22 21:13:38.424 : INFO: instruction count: 3: __C1199Compiled.addFS 2023-04-22 21:13:38.424 : INFO: instruction count: 4: __C1199Compiled.addTaskContext 2023-04-22 21:13:38.424 : INFO: instruction count: 64: __C1199Compiled.addAndDecodeLiterals 2023-04-22 21:13:38.424 : INFO: instruction count: 18: __C1199Compiled.__m1205DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:38.424 : INFO: instruction count: 27: __C1199Compiled.__m1206DECODE_r_struct_of_r_array_of_r_float64END_TO_SBaseStructPointer 2023-04-22 21:13:38.424 : INFO: instruction count: 58: __C1199Compiled.__m1207INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:38.424 : INFO: instruction count: 10: __C1199Compiled.__m1208INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:38.426 : INFO: encoder cache miss (18 hits, 14 misses, 0.563) 2023-04-22 21:13:38.429 : INFO: instruction count: 3: __C1209HailClassLoaderContainer. 2023-04-22 21:13:38.429 : INFO: instruction count: 3: __C1209HailClassLoaderContainer. 2023-04-22 21:13:38.429 : INFO: instruction count: 3: __C1211FSContainer. 2023-04-22 21:13:38.429 : INFO: instruction count: 3: __C1211FSContainer. 2023-04-22 21:13:38.443 : INFO: instruction count: 3: __C1213etypeEncode. 
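Two of the counters around this point can be read off the logged numbers themselves (an inference from this log, not a documented format): the encoder-cache line "(18 hits, 14 misses, 0.563)" is consistent with the third value being the hit fraction, 18 / (18 + 14) = 0.5625, printed rounded as 0.563; and in the "timing ... total/self/children/%children" entries that follow, total = self + children and %children = children / total, e.g. for SparkBackend.executeEncode below: 24.668 ms + 553.390 ms = 578.058 ms and 553.390 / 578.058 ≈ 95.73%. A small check of both relations:

    # Hedged reading of the counters in the surrounding entries; field meanings are
    # inferred from the numbers in this log, not from Hail documentation.
    hits, misses = 18, 14
    hit_rate = hits / (hits + misses)                      # 0.5625, logged as 0.563

    total, self_ms, children = 578.058, 24.668, 553.390    # SparkBackend.executeEncode
    assert abs(total - (self_ms + children)) < 1e-6        # total = self + children
    pct_children = 100.0 * children / total                # ~95.73, matches "%children 95.73%"
    print(hit_rate, round(pct_children, 2))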
2023-04-22 21:13:38.443 : INFO: instruction count: 7: __C1213etypeEncode.apply 2023-04-22 21:13:38.443 : INFO: instruction count: 33: __C1213etypeEncode.__m1215ENCODE_SBaseStructPointer_TO_o_struct_of_o_array_of_o_float64END 2023-04-22 21:13:38.443 : INFO: instruction count: 78: __C1213etypeEncode.__m1216ENCODE_SIndexablePointer_TO_o_array_of_o_float64 2023-04-22 21:13:38.443 : INFO: instruction count: 4: __C1213etypeEncode.__m1217ENCODE_SFloat64$_TO_o_float64 2023-04-22 21:13:38.444 : INFO: finished execution of query hail_query_4, result size is 87.00 B 2023-04-22 21:13:38.444 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:38.445 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:38.445 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:38.445 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode total 578.058ms self 24.668ms children 553.390ms %children 95.73% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 2.547ms self 0.007ms children 2.539ms %children 99.72% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 2.505ms self 0.034ms children 2.470ms %children 98.63% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 2.470ms self 0.088ms children 2.383ms %children 96.46% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.114ms self 0.114ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.235ms self 0.235ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.212ms self 0.212ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.340ms self 0.340ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.525ms self 0.525ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, 
initial IR/LoweringTransformation/Optimize/FoldConstants total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.097ms self 0.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.259ms self 0.259ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 0.051ms self 0.004ms children 0.047ms %children 92.14% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 0.788ms self 0.005ms children 0.783ms %children 99.42% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.758ms self 0.027ms children 0.730ms %children 96.38% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.730ms self 0.024ms children 0.707ms %children 96.78% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.088ms self 0.088ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.090ms self 0.090ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.175ms self 0.175ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.247ms self 0.247ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 0.123ms self 0.004ms children 0.119ms %children 96.90% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.445 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 457.095ms self 0.007ms children 457.088ms %children 100.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 457.017ms self 1.791ms children 455.226ms %children 99.61% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.038ms self 0.003ms children 0.035ms %children 91.69% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.551ms self 0.004ms children 0.547ms %children 99.23% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: 
relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.536ms self 0.012ms children 0.524ms %children 97.69% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.524ms self 0.019ms children 0.505ms %children 96.44% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.085ms self 0.085ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 402.558ms self 0.008ms children 402.550ms %children 100.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 402.527ms self 96.606ms children 305.921ms %children 76.00% 2023-04-22 21:13:38.446 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 1.925ms self 0.006ms children 1.918ms %children 99.67% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 1.888ms self 0.020ms children 1.868ms %children 98.96% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 1.868ms self 0.052ms children 1.816ms %children 97.22% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.271ms self 0.271ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.142ms self 0.142ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.090ms self 0.090ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.355ms self 0.355ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.218ms self 0.218ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 
21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.099ms self 0.099ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.166ms self 0.166ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.186ms self 0.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 0.058ms self 0.004ms children 0.054ms %children 92.36% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.733ms self 0.005ms children 0.728ms %children 99.32% 2023-04-22 21:13:38.446 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.709ms self 0.016ms children 0.693ms %children 97.80% 2023-04-22 21:13:38.446 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.693ms self 0.015ms children 0.678ms %children 97.77% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.207ms self 0.207ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.052ms self 0.005ms children 0.047ms %children 90.60% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.035ms self 0.004ms children 0.031ms %children 88.21% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 0.039ms self 0.004ms children 0.035ms %children 90.17% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.700ms self 0.005ms children 0.694ms %children 99.26% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify 
total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.674ms self 0.015ms children 0.660ms %children 97.79% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.660ms self 0.015ms children 0.645ms %children 97.79% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.108ms self 0.108ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.180ms self 0.180ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 0.071ms self 0.004ms children 0.067ms %children 94.43% 2023-04-22 21:13:38.447 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.693ms self 0.005ms children 0.688ms %children 99.27% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.669ms self 0.014ms children 0.655ms %children 97.93% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.655ms self 0.014ms children 0.641ms %children 97.90% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.170ms self 0.170ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.447 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 245.641ms self 228.095ms children 17.547ms %children 7.14% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.916ms self 0.005ms children 0.912ms %children 99.50% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.873ms self 0.023ms children 0.850ms %children 97.39% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.850ms self 0.019ms children 0.831ms %children 97.80% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.091ms self 0.091ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.212ms self 0.212ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.237ms self 0.237ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.039ms self 0.005ms children 0.035ms %children 88.40% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 1.124ms self 0.006ms children 1.119ms %children 99.49% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 1.087ms self 0.029ms children 1.057ms %children 97.33% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize total 1.057ms self 0.018ms children 1.039ms %children 98.27% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.093ms self 0.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.227ms self 0.227ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.146ms self 0.146ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.232ms self 0.232ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.240ms self 0.240ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.481ms self 0.005ms children 0.476ms %children 98.90% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.453ms self 0.453ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.801ms self 0.005ms children 0.795ms %children 99.32% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.775ms self 0.024ms children 0.751ms %children 96.89% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.751ms self 0.016ms children 0.735ms %children 97.82% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.186ms self 0.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.228ms self 0.228ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.376ms self 0.376ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.951ms self 0.006ms children 0.945ms %children 99.34% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.921ms self 0.025ms children 0.896ms %children 97.32% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.896ms self 0.025ms children 0.871ms %children 97.22% 2023-04-22 21:13:38.448 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.109ms self 0.109ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.138ms self 0.138ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.083ms self 0.083ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.198ms self 0.198ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : 
INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.262ms self 0.262ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.038ms self 0.004ms children 0.033ms %children 88.38% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.810ms self 0.005ms children 0.805ms %children 99.37% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.784ms self 0.021ms children 0.763ms %children 97.27% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.763ms self 0.017ms children 0.746ms %children 97.76% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.114ms self 0.114ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.186ms self 0.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.242ms self 0.242ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.434ms self 0.005ms children 0.429ms %children 98.74% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.402ms self 0.402ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.801ms self 0.005ms children 0.796ms %children 99.35% 2023-04-22 21:13:38.449 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.776ms self 0.021ms children 0.755ms %children 97.35% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.755ms self 0.017ms children 0.739ms %children 97.78% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.229ms self 0.229ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.339ms self 0.339ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.917ms self 0.005ms children 0.912ms %children 99.44% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.889ms self 0.022ms children 0.866ms %children 97.49% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.866ms self 0.018ms children 0.848ms %children 97.94% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.096ms self 0.096ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.143ms self 0.143ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.084ms self 0.084ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.449 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.248ms self 0.248ms children 0.000ms %children 0.00% 2023-04-22 
21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.036ms self 0.004ms children 0.032ms %children 87.71% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.799ms self 0.005ms children 0.793ms %children 99.36% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.773ms self 0.022ms children 0.752ms %children 97.21% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.752ms self 0.017ms children 0.735ms %children 97.74% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.227ms self 0.227ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.332ms self 0.005ms children 0.327ms %children 98.57% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.307ms self 0.307ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 8.024ms self 0.005ms children 8.019ms %children 99.94% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 7.997ms self 0.021ms children 7.976ms %children 99.73% 2023-04-22 21:13:38.450 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 7.976ms self 0.018ms children 7.958ms %children 99.77% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.114ms self 0.114ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 7.407ms self 7.407ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.243ms self 0.243ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.327ms self 0.327ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 2.150ms self 2.150ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledFunction total 53.825ms self 53.825ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.245ms self 0.005ms children 0.240ms %children 98.01% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.234ms self 0.011ms children 0.223ms %children 95.28% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.223ms self 0.026ms children 0.197ms %children 88.31% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.450 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify 
total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR total 0.166ms self 0.003ms children 0.163ms %children 97.92% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.159ms self 0.008ms children 0.151ms %children 95.26% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.151ms self 0.009ms children 0.142ms %children 94.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable total 0.032ms self 0.003ms children 0.028ms %children 90.07% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.023ms self 0.023ms children 0.000ms %children 0.00% 
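[Note on the timing records above: each entry is a '/'-separated path of lowering and compilation passes (e.g. SparkBackend.executeEncode/EvalRelationalLets/.../Compile/...) followed by total, self, children, and %children figures; in this stretch of the run the single largest contribution is RunCompiledFunction at 53.825ms. The snippet below is a minimal, hypothetical Python sketch for aggregating these records out of a log file such as this one; the regex, the hail.log file name, and the helper function are assumptions for illustration, not part of Hail's API.]

    import re
    from collections import defaultdict

    # Assumed record shape, matching the lines in this log:
    #   INFO: timing <stage path> total <t>ms self <t>ms children <t>ms %children <p>%
    TIMING_RE = re.compile(
        r"INFO: timing (?P<path>.+?) "
        r"total (?P<total>[\d.]+)ms "
        r"self (?P<self>[\d.]+)ms "
        r"children (?P<children>[\d.]+)ms "
        r"%children (?P<pct>[\d.]+)%"
    )

    def self_time_by_stage(log_text):
        """Sum 'self' time (ms) per final path component across all timing records."""
        totals = defaultdict(float)
        for m in TIMING_RE.finditer(log_text):
            stage = m.group("path").split("/")[-1]
            totals[stage] += float(m.group("self"))
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

    if __name__ == "__main__":
        # "hail.log" is an assumed path to a saved copy of this log.
        with open("hail.log") as fh:
            for stage, ms in self_time_by_stage(fh.read())[:10]:
                print(f"{ms:10.3f} ms  {stage}")

[Because the regex scans with finditer, it works even when several records are fused onto one physical line, as they are throughout this log.]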
2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.155ms self 0.003ms children 0.152ms %children 98.12% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.149ms self 0.007ms children 0.141ms %children 95.14% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.141ms self 0.009ms children 0.133ms %children 93.88% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.025ms self 0.003ms children 0.022ms %children 86.74% 2023-04-22 21:13:38.451 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets total 0.014ms self 0.003ms children 0.012ms %children 82.38% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.017ms self 0.002ms children 0.015ms %children 86.53% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.152ms self 0.003ms children 0.149ms %children 98.21% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.451 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.146ms self 0.007ms children 0.139ms %children 95.12% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.139ms self 0.009ms children 0.130ms %children 93.86% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 0.049ms self 0.003ms children 0.047ms %children 94.25% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.452 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 1.397ms self 0.005ms children 1.392ms %children 99.62% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 1.385ms self 0.010ms children 1.375ms %children 99.31% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 1.375ms self 0.012ms children 1.364ms %children 99.15% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 1.262ms self 1.262ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile total 48.390ms self 42.326ms children 6.064ms %children 12.53% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.385ms self 0.005ms children 0.380ms %children 98.81% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.369ms self 0.015ms children 0.353ms %children 95.88% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.353ms self 0.014ms children 0.339ms %children 95.95% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.105ms self 0.105ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.082ms self 0.082ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.023ms self 0.004ms children 0.020ms %children 84.26% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.343ms self 0.004ms children 0.339ms %children 98.79% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.465 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.329ms self 0.015ms children 0.314ms %children 95.49% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.314ms self 0.013ms children 0.301ms %children 95.90% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.095ms self 0.095ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.274ms self 0.005ms children 0.270ms %children 98.35% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.258ms self 0.258ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.356ms self 0.005ms children 0.351ms %children 98.68% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 
0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.341ms self 0.015ms children 0.326ms %children 95.68% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.326ms self 0.013ms children 0.313ms %children 95.92% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.095ms self 0.095ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.079ms self 0.079ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.385ms self 0.004ms children 0.381ms %children 98.90% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.369ms self 
0.016ms children 0.353ms %children 95.61% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.353ms self 0.014ms children 0.339ms %children 96.02% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.023ms self 0.004ms children 0.019ms %children 83.69% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.358ms self 0.012ms children 0.345ms %children 96.60% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.335ms self 0.015ms children 0.320ms %children 95.44% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.320ms self 0.013ms children 0.307ms %children 95.92% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.466 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.099ms self 0.099ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.241ms self 0.004ms children 0.237ms %children 98.16% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.226ms self 0.226ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.345ms self 0.004ms children 0.340ms %children 98.77% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.330ms self 0.015ms children 0.315ms %children 95.35% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.315ms self 0.014ms children 0.301ms %children 95.63% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.093ms self 0.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.420ms self 0.004ms children 0.416ms %children 
98.96% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.404ms self 0.016ms children 0.387ms %children 95.96% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.387ms self 0.014ms children 0.373ms %children 96.34% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.098ms self 0.098ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.022ms self 0.004ms children 0.019ms %children 83.92% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 
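[Annotation] The nested "timing SparkBackend.executeEncode/..." entries above and below are Hail's per-pass compile timings: each entry reports total, self, children, and %children, where total is approximately self + children (e.g. 0.341ms = 0.015ms self + 0.326ms children, %children 95.68% in the first entry of this run). The query being compiled here is hail_query_5, whose IR is printed further down in this log (a MatrixPLINKReader over the HGDP_1KG_LD .bed/.bim/.fam files, a column annotation from a persisted scores table, and a column aggregation summing toInt64(IsNA(scores))). As a rough, non-authoritative sketch only: the driver-side Python session that produces this kind of IR could look like the following. Variable names, how the scores table was originally created/persisted, and the exact aggregation call are assumptions; the file paths and reader options are taken verbatim from the reader config in the IR dump.

    import hail as hl

    hl.init(log='hail.log')  # the "timing ..." entries above are written to this log at INFO level

    # Contig recoding as it appears in the MatrixPLINKReader config ("1" -> "chr1", ..., "26" -> "chrM")
    recode = {str(i): f'chr{i}' for i in range(1, 23)}
    recode.update({'23': 'chrX', '24': 'chrY', '25': 'chrX', '26': 'chrM'})

    mt = hl.import_plink(
        bed='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed',
        bim='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim',
        fam='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam',
        reference_genome='GRCh38',
        contig_recoding=recode,
        a2_reference=True,
    )

    # The IR reads PCA scores from the globals of a persisted table
    # (.../PC_Relate/HG38/persist_table20F9rYCfO1); reading that path directly is a
    # stand-in for however the table was persisted in the original session.
    pca_ht = hl.read_table(
        '/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1')
    scores_ht = hl.Table.parallelize(pca_ht.index_globals().scores).key_by('s')

    # Annotate each sample with its score vector, then count samples with no matching entry.
    # An aggregation like this lowers to the Sum(toInt64(IsNA(...))) seen in the IR below.
    mt = mt.annotate_cols(scores=scores_ht[mt.s].scores)
    n_unmatched = mt.aggregate_cols(hl.agg.count_where(hl.is_missing(mt.scores)))
    print(n_unmatched)

The sub-millisecond per-pass timings in this block indicate that the optimizer passes (FoldConstants, Simplify, ForwardLets, PruneDeadFields, etc.) are cheap for this query; the dominant compile cost appears later under SparkBackend.executeEncode/Compile (about 90ms, mostly InlineApplyIR).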
2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.339ms self 0.004ms children 0.335ms %children 98.80% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.324ms self 0.014ms children 0.310ms %children 95.57% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.310ms self 0.013ms children 0.297ms %children 95.70% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.092ms self 0.092ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.233ms self 0.005ms children 0.229ms %children 98.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.218ms self 0.218ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.467 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 1.712ms self 0.005ms children 1.707ms %children 99.71% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 1.694ms self 0.030ms children 1.664ms %children 98.21% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 1.664ms self 0.014ms children 1.650ms %children 99.13% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 1.441ms self 1.441ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.010ms self 0.010ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/InitializeCompiledFunction total 1.431ms self 1.431ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/RunCompiledFunction total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.031ms self 0.004ms children 0.027ms %children 87.27% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.913ms self 0.005ms children 0.908ms %children 99.44% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.890ms self 0.022ms children 0.867ms %children 97.51% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.867ms self 0.038ms children 0.829ms %children 95.58% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.487ms self 0.487ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 
0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 0.075ms self 0.004ms children 0.071ms %children 95.01% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.499ms self 0.005ms children 0.493ms %children 98.95% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.479ms self 0.019ms children 0.460ms %children 96.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.460ms self 0.015ms children 0.444ms %children 96.65% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.083ms self 0.083ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.120ms self 0.120ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.117ms self 0.117ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/Compile total 89.844ms self 47.960ms children 41.884ms %children 46.62% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.549ms self 0.005ms children 0.545ms %children 99.17% 2023-04-22 21:13:38.468 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.529ms self 0.022ms children 0.506ms %children 95.79% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.506ms self 0.018ms children 0.489ms %children 96.53% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.142ms self 0.142ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 35.175ms self 0.007ms children 35.168ms %children 99.98% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 35.150ms self 35.150ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.554ms self 0.005ms children 0.549ms %children 99.16% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.533ms self 0.027ms children 0.506ms %children 94.89% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.506ms self 0.020ms children 0.486ms %children 96.01% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.145ms self 0.145ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.319ms self 0.004ms children 0.314ms %children 98.59% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.300ms self 0.300ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.549ms self 0.005ms children 0.545ms %children 99.18% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.529ms self 0.022ms children 0.507ms %children 95.76% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.507ms self 0.016ms children 0.491ms %children 96.76% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.157ms self 0.157ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.263ms self 0.263ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.567ms self 0.004ms children 0.562ms %children 99.26% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.543ms self 0.023ms children 0.519ms %children 95.74% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.519ms self 0.017ms children 0.502ms %children 96.68% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.469 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.028ms self 0.004ms children 0.024ms %children 85.07% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.566ms self 0.006ms children 0.559ms %children 98.86% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.543ms self 0.023ms children 0.520ms %children 95.79% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.520ms self 0.018ms children 0.502ms %children 96.50% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.065ms self 0.065ms children 0.000ms %children 
0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.140ms self 0.140ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.308ms self 0.004ms children 0.304ms %children 98.58% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.289ms self 0.289ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.542ms self 0.006ms children 0.536ms %children 98.83% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.517ms self 0.022ms children 0.495ms %children 95.70% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.495ms self 0.016ms children 0.479ms %children 96.73% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.049ms self 0.049ms children 0.000ms %children 0.00% 2023-04-22 
21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.249ms self 0.249ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.582ms self 0.004ms children 0.578ms %children 99.26% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.561ms self 0.022ms children 0.539ms %children 96.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.539ms self 0.018ms children 0.521ms %children 96.73% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.085ms self 0.085ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.149ms self 0.149ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.470 : INFO: timing 
SparkBackend.executeEncode/Compile/InlineApplyIR total 0.028ms self 0.004ms children 0.024ms %children 86.86% 2023-04-22 21:13:38.470 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.549ms self 0.004ms children 0.545ms %children 99.24% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.530ms self 0.023ms children 0.507ms %children 95.74% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.507ms self 0.017ms children 0.491ms %children 96.72% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.141ms self 0.141ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.298ms self 0.004ms children 0.294ms %children 98.61% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing 
SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.279ms self 0.279ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.524ms self 0.005ms children 0.519ms %children 99.13% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.504ms self 0.022ms children 0.482ms %children 95.55% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.482ms self 0.016ms children 0.466ms %children 96.62% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.140ms self 0.140ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.235ms self 0.235ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.418ms self 1.418ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.471 : INFO: timing SparkBackend.executeEncode/RunCompiledFunction total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.783 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:38.855 : 
INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:38.855 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:38.855 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:38.855 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:38.855 : INFO: timing SparkBackend.parse_value_ir total 72.187ms self 72.187ms children 0.000ms %children 0.00% 2023-04-22 21:13:38.856 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:38.856 : INFO: starting execution of query hail_query_5 of initial size 35 2023-04-22 21:13:38.859 : INFO: initial IR: IR size 35: (Let __rng_state (RNGStateLiteral) (MakeTuple (0) (TableAggregate (MatrixColsTable (MatrixMapCols None (MatrixMapCols () (MatrixMapCols None (MatrixAnnotateColsTable "__uid_4" (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],loadings:Array[Float64]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (MakeStruct)))))) (InsertFields (SelectFields (s) (SelectFields (s fam_id pat_id mat_id is_female is_case) (Ref sa))) None (__scores (GetField scores (GetField __uid_4 (Ref sa)))))) (SelectFields (s __scores) (Ref sa))) (InsertFields (SelectFields (__scores) (SelectFields (s __scores) (Ref sa))) None))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))))) 2023-04-22 21:13:38.935 : INFO: after optimize: relationalLowerer, initial IR: IR size 24: (MakeTuple (0) (TableAggregate (MatrixColsTable (MatrixMapCols () (MatrixAnnotateColsTable "__uid_4" (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String]},entry:Struct{}} False False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (Literal Struct{} )))))) (InsertFields (SelectFields () (Ref sa)) None (__scores (GetField scores (GetField __uid_4 (Ref sa))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row)))))))) 2023-04-22 21:13:38.967 : INFO: after LowerMatrixToTable: IR size 79: (MakeTuple (0) (TableAggregate (TableKeyBy () False (TableParallelize None (Let __cols_and_globals (TableGetGlobals (TableMapGlobals (TableMapGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __dictfield (ToDict (ToStream False (ToArray (StreamMap __iruid_1001 (ToStream False (GetField rows (TableCollect (TableKeyBy () False (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (Literal Struct{} ))))))))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1001)) (SelectFields (scores) (Ref __iruid_1001))))))) (InsertFields (Ref global) None (__cols (ToArray (StreamMap __iruid_1000 (ToStream False (GetField __cols (Ref global))) (InsertFields (Ref __iruid_1000) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __dictfield) (MakeStruct (s (GetField s (Ref __iruid_1000))))))))))))) (InsertFields (Ref global) None (__cols (ToArray (StreamMap __iruid_1005 (ToStream False (ToArray (StreamRange -1 False (I32 0) (ArrayLen (GetField __cols (Ref global))) (I32 1)))) (Let __cols_array (GetField __cols (Ref global)) (Let sa (ArrayRef -1 (Ref __cols_array) (Ref __iruid_1005)) (InsertFields (SelectFields () (Ref sa)) None (__scores (GetField scores (GetField __uid_4 (Ref sa))))))))))))) (MakeStruct (rows (GetField __cols (Ref __cols_and_globals))) (global (SelectFields () (Ref __cols_and_globals))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row)))))))) 2023-04-22 21:13:39.073 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 56: (MakeTuple (0) (TableAggregate (TableParallelize None (Let __iruid_1074 (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (Let __iruid_1075 (ToDict (StreamMap __iruid_1076 (ToStream False (GetField scores (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1076)) (SelectFields (scores) (Ref __iruid_1076))))) (Let __iruid_1077 (ToArray (StreamMap __iruid_1078 (ToStream False (GetField __cols (Ref __iruid_1074))) (InsertFields (Ref __iruid_1078) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1075) (MakeStruct (s (GetField s (Ref __iruid_1078))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1079 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1077)) (I32 1)) (Let __iruid_1080 (ArrayRef -1 (Ref __iruid_1077) (Ref __iruid_1079)) (InsertFields (SelectFields () (Ref __iruid_1080)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1080))))))))) (global (SelectFields () (Ref __iruid_1074)))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row)))))))) 2023-04-22 21:13:39.075 : INFO: after LiftRelationalValuesToRelationalLets: IR size 62: (RelationalLet __iruid_1091 (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (RelationalLet __iruid_1092 (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) (RelationalLet __iruid_1090 (TableAggregate (TableParallelize None (Let __iruid_1074 (RelationalRef __iruid_1091 
Struct{__cols:Array[Struct{s:String}]}) (Let __iruid_1075 (ToDict (StreamMap __iruid_1076 (ToStream False (GetField scores (RelationalRef __iruid_1092 Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1076)) (SelectFields (scores) (Ref __iruid_1076))))) (Let __iruid_1077 (ToArray (StreamMap __iruid_1078 (ToStream False (GetField __cols (Ref __iruid_1074))) (InsertFields (Ref __iruid_1078) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1075) (MakeStruct (s (GetField s (Ref __iruid_1078))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1079 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1077)) (I32 1)) (Let __iruid_1080 (ArrayRef -1 (Ref __iruid_1077) (Ref __iruid_1079)) (InsertFields (SelectFields () (Ref __iruid_1080)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1080))))))))) (global (SelectFields () (Ref __iruid_1074)))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))) (MakeTuple (0) (RelationalRef __iruid_1090 Int64))))) 2023-04-22 21:13:39.077 : INFO: initial IR: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:13:39.077 : INFO: after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:13:39.078 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:13:39.078 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 2023-04-22 21:13:39.078 : INFO: lowering result: TableGetGlobals 2023-04-22 21:13:39.102 MemoryStore: INFO: Block broadcast_155 stored as values in memory (estimated size 47.3 MiB, free 25.1 GiB) 2023-04-22 21:13:39.845 MemoryStore: INFO: Block broadcast_155_piece0 stored as bytes in memory (estimated size 2.3 MiB, free 25.1 GiB) 2023-04-22 21:13:39.845 BlockManagerInfo: INFO: Added broadcast_155_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 2.3 MiB, free: 25.3 GiB) 2023-04-22 21:13:39.846 SparkContext: INFO: Created broadcast 155 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:39.851 : INFO: compiling and evaluating result: TableGetGlobals 2023-04-22 21:13:39.851 : INFO: initial IR: IR size 3: (Let __iruid_1093 (Literal Struct{__cols:Array[Struct{s:String}]} ) (Ref __iruid_1093)) 2023-04-22 21:13:39.852 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.852 : INFO: after LowerMatrixToTable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.852 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.852 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.852 : INFO: after EvalRelationalLets: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.852 : INFO: after LowerAndExecuteShuffles: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.854 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.854 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.854 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:13:39.884 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Struct{__cols:Array[Struct{s:String}]} )) 2023-04-22 21:13:39.884 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.885 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.885 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.915 : 
INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.915 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.923 : INFO: encoder cache miss (18 hits, 15 misses, 0.545) 2023-04-22 21:13:39.925 : INFO: instruction count: 3: __C1234HailClassLoaderContainer. 2023-04-22 21:13:39.925 : INFO: instruction count: 3: __C1234HailClassLoaderContainer. 2023-04-22 21:13:39.926 : INFO: instruction count: 3: __C1236FSContainer. 2023-04-22 21:13:39.926 : INFO: instruction count: 3: __C1236FSContainer. 2023-04-22 21:13:39.926 : INFO: instruction count: 3: __C1238etypeEncode. 2023-04-22 21:13:39.926 : INFO: instruction count: 7: __C1238etypeEncode.apply 2023-04-22 21:13:39.927 : INFO: instruction count: 9: __C1238etypeEncode.__m1240ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDENDEND 2023-04-22 21:13:39.927 : INFO: instruction count: 9: __C1238etypeEncode.__m1241ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDEND 2023-04-22 21:13:39.927 : INFO: instruction count: 25: __C1238etypeEncode.__m1242ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:39.927 : INFO: instruction count: 35: __C1238etypeEncode.__m1243ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryEND 2023-04-22 21:13:39.927 : INFO: instruction count: 13: __C1238etypeEncode.__m1244ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:39.927 : INFO: instruction count: 16: __C1238etypeEncode.__m1245ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:39.952 MemoryStore: INFO: Block broadcast_156 stored as values in memory (estimated size 47.4 KiB, free 25.1 GiB) 2023-04-22 21:13:39.953 MemoryStore: INFO: Block broadcast_156_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:39.954 BlockManagerInfo: INFO: Added broadcast_156_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:39.955 SparkContext: INFO: Created broadcast 156 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:39.955 : INFO: instruction count: 3: __C1218HailClassLoaderContainer. 2023-04-22 21:13:39.957 : INFO: instruction count: 3: __C1218HailClassLoaderContainer. 2023-04-22 21:13:39.958 : INFO: instruction count: 3: __C1220FSContainer. 2023-04-22 21:13:39.958 : INFO: instruction count: 3: __C1220FSContainer. 2023-04-22 21:13:39.959 : INFO: instruction count: 3: __C1222Compiled. 
2023-04-22 21:13:39.959 : INFO: instruction count: 7: __C1222Compiled.apply 2023-04-22 21:13:39.959 : INFO: instruction count: 9: __C1222Compiled.setPartitionIndex 2023-04-22 21:13:39.959 : INFO: instruction count: 4: __C1222Compiled.addPartitionRegion 2023-04-22 21:13:39.959 : INFO: instruction count: 4: __C1222Compiled.setPool 2023-04-22 21:13:39.959 : INFO: instruction count: 3: __C1222Compiled.addHailClassLoader 2023-04-22 21:13:39.959 : INFO: instruction count: 3: __C1222Compiled.addFS 2023-04-22 21:13:39.959 : INFO: instruction count: 4: __C1222Compiled.addTaskContext 2023-04-22 21:13:39.959 : INFO: instruction count: 41: __C1222Compiled.addAndDecodeLiterals 2023-04-22 21:13:39.959 : INFO: instruction count: 27: __C1222Compiled.__m1228DECODE_r_struct_of_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:39.959 : INFO: instruction count: 17: __C1222Compiled.__m1229INPLACE_DECODE_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDEND_TO_r_tuple_of_r_struct_of_r_array_of_r_struct_of_r_stringENDENDEND 2023-04-22 21:13:39.960 : INFO: instruction count: 17: __C1222Compiled.__m1230INPLACE_DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_array_of_r_struct_of_r_stringENDEND 2023-04-22 21:13:39.960 : INFO: instruction count: 58: __C1222Compiled.__m1231INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:39.960 : INFO: instruction count: 17: __C1222Compiled.__m1232INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:39.960 : INFO: instruction count: 31: __C1222Compiled.__m1233INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:39.960 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Struct{__cols:Array[Struct{s:String}]} )) 2023-04-22 21:13:39.960 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.960 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.961 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.963 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.963 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:39.993 : INFO: encoder cache hit 2023-04-22 21:13:39.997 MemoryStore: INFO: Block broadcast_157 stored as values in memory (estimated size 47.4 KiB, free 25.1 GiB) 2023-04-22 21:13:39.998 MemoryStore: INFO: Block broadcast_157_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.000 BlockManagerInfo: INFO: Added broadcast_157_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.000 SparkContext: INFO: Created broadcast 157 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.001 : INFO: instruction count: 3: __C1246HailClassLoaderContainer. 2023-04-22 21:13:40.001 : INFO: instruction count: 3: __C1246HailClassLoaderContainer. 2023-04-22 21:13:40.001 : INFO: instruction count: 3: __C1248FSContainer. 2023-04-22 21:13:40.001 : INFO: instruction count: 3: __C1248FSContainer. 2023-04-22 21:13:40.002 : INFO: instruction count: 3: __C1250Compiled. 
2023-04-22 21:13:40.002 : INFO: instruction count: 7: __C1250Compiled.apply 2023-04-22 21:13:40.002 : INFO: instruction count: 9: __C1250Compiled.setPartitionIndex 2023-04-22 21:13:40.002 : INFO: instruction count: 4: __C1250Compiled.addPartitionRegion 2023-04-22 21:13:40.002 : INFO: instruction count: 4: __C1250Compiled.setPool 2023-04-22 21:13:40.002 : INFO: instruction count: 3: __C1250Compiled.addHailClassLoader 2023-04-22 21:13:40.002 : INFO: instruction count: 3: __C1250Compiled.addFS 2023-04-22 21:13:40.002 : INFO: instruction count: 4: __C1250Compiled.addTaskContext 2023-04-22 21:13:40.003 : INFO: instruction count: 41: __C1250Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.003 : INFO: instruction count: 27: __C1250Compiled.__m1256DECODE_r_struct_of_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.003 : INFO: instruction count: 17: __C1250Compiled.__m1257INPLACE_DECODE_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDEND_TO_r_tuple_of_r_struct_of_r_array_of_r_struct_of_r_stringENDENDEND 2023-04-22 21:13:40.003 : INFO: instruction count: 17: __C1250Compiled.__m1258INPLACE_DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_array_of_r_struct_of_r_stringENDEND 2023-04-22 21:13:40.003 : INFO: instruction count: 58: __C1250Compiled.__m1259INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:40.003 : INFO: instruction count: 17: __C1250Compiled.__m1260INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:40.003 : INFO: instruction count: 31: __C1250Compiled.__m1261INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.004 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Struct{__cols:Array[Struct{s:String}]} )) 2023-04-22 21:13:40.004 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:40.004 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:40.005 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:40.007 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:40.007 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Struct{__cols:Array[Struct{s:String}]}] ) 2023-04-22 21:13:40.026 : INFO: encoder cache hit 2023-04-22 21:13:40.030 MemoryStore: INFO: Block broadcast_158 stored as values in memory (estimated size 47.4 KiB, free 25.1 GiB) 2023-04-22 21:13:40.031 MemoryStore: INFO: Block broadcast_158_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.044 BlockManagerInfo: INFO: Added broadcast_158_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.044 SparkContext: INFO: Created broadcast 158 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.045 : INFO: instruction count: 3: __C1262HailClassLoaderContainer. 2023-04-22 21:13:40.045 : INFO: instruction count: 3: __C1262HailClassLoaderContainer. 2023-04-22 21:13:40.045 : INFO: instruction count: 3: __C1264FSContainer. 2023-04-22 21:13:40.045 : INFO: instruction count: 3: __C1264FSContainer. 2023-04-22 21:13:40.046 : INFO: instruction count: 3: __C1266Compiled. 
2023-04-22 21:13:40.046 : INFO: instruction count: 7: __C1266Compiled.apply 2023-04-22 21:13:40.046 : INFO: instruction count: 9: __C1266Compiled.setPartitionIndex 2023-04-22 21:13:40.046 : INFO: instruction count: 4: __C1266Compiled.addPartitionRegion 2023-04-22 21:13:40.046 : INFO: instruction count: 4: __C1266Compiled.setPool 2023-04-22 21:13:40.046 : INFO: instruction count: 3: __C1266Compiled.addHailClassLoader 2023-04-22 21:13:40.046 : INFO: instruction count: 3: __C1266Compiled.addFS 2023-04-22 21:13:40.047 : INFO: instruction count: 4: __C1266Compiled.addTaskContext 2023-04-22 21:13:40.047 : INFO: instruction count: 41: __C1266Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.047 : INFO: instruction count: 27: __C1266Compiled.__m1272DECODE_r_struct_of_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.047 : INFO: instruction count: 17: __C1266Compiled.__m1273INPLACE_DECODE_r_struct_of_r_struct_of_r_array_of_r_struct_of_r_binaryENDENDEND_TO_r_tuple_of_r_struct_of_r_array_of_r_struct_of_r_stringENDENDEND 2023-04-22 21:13:40.047 : INFO: instruction count: 17: __C1266Compiled.__m1274INPLACE_DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_array_of_r_struct_of_r_stringENDEND 2023-04-22 21:13:40.047 : INFO: instruction count: 58: __C1266Compiled.__m1275INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:40.047 : INFO: instruction count: 17: __C1266Compiled.__m1276INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:40.047 : INFO: instruction count: 31: __C1266Compiled.__m1277INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.051 : INFO: encoder cache hit 2023-04-22 21:13:40.052 : INFO: took 973.263ms 2023-04-22 21:13:40.052 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.055 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.055 : INFO: initial IR: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.055 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.060 : INFO: after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.060 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.060 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.060 : INFO: after EvalRelationalLets: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.060 : INFO: after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.061 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.061 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.061 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:13:40.062 : INFO: 
initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.063 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.063 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.064 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.064 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.065 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.101 : INFO: encoder cache hit 2023-04-22 21:13:40.101 MemoryStore: INFO: Block broadcast_159 stored as values in memory (estimated size 47.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.103 MemoryStore: INFO: Block broadcast_159_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.110 BlockManagerInfo: INFO: Added broadcast_159_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.123 SparkContext: INFO: Created broadcast 159 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.123 : INFO: instruction count: 3: __C1278HailClassLoaderContainer. 2023-04-22 21:13:40.123 : INFO: instruction count: 3: __C1278HailClassLoaderContainer. 2023-04-22 21:13:40.123 : INFO: instruction count: 3: __C1280FSContainer. 2023-04-22 21:13:40.123 : INFO: instruction count: 3: __C1280FSContainer. 2023-04-22 21:13:40.125 : INFO: instruction count: 3: __C1282Compiled. 
2023-04-22 21:13:40.125 : INFO: instruction count: 27: __C1282Compiled.apply 2023-04-22 21:13:40.125 : INFO: instruction count: 9: __C1282Compiled.setPartitionIndex 2023-04-22 21:13:40.125 : INFO: instruction count: 4: __C1282Compiled.addPartitionRegion 2023-04-22 21:13:40.125 : INFO: instruction count: 4: __C1282Compiled.setPool 2023-04-22 21:13:40.125 : INFO: instruction count: 3: __C1282Compiled.addHailClassLoader 2023-04-22 21:13:40.125 : INFO: instruction count: 3: __C1282Compiled.addFS 2023-04-22 21:13:40.125 : INFO: instruction count: 4: __C1282Compiled.addTaskContext 2023-04-22 21:13:40.125 : INFO: instruction count: 64: __C1282Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.125 : INFO: instruction count: 18: __C1282Compiled.__m1288DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.125 : INFO: instruction count: 27: __C1282Compiled.__m1289DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.125 : INFO: instruction count: 58: __C1282Compiled.__m1290INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:40.125 : INFO: instruction count: 17: __C1282Compiled.__m1291INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:40.125 : INFO: instruction count: 31: __C1282Compiled.__m1292INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.125 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.126 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.126 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.127 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.127 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.127 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.131 : INFO: encoder cache hit 2023-04-22 21:13:40.131 MemoryStore: INFO: Block broadcast_160 stored as values in memory (estimated size 47.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.132 MemoryStore: INFO: Block broadcast_160_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.135 BlockManagerInfo: INFO: Added broadcast_160_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.136 SparkContext: INFO: Created broadcast 160 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.136 : INFO: instruction count: 3: __C1293HailClassLoaderContainer. 2023-04-22 21:13:40.136 : INFO: instruction count: 3: __C1293HailClassLoaderContainer. 2023-04-22 21:13:40.136 : INFO: instruction count: 3: __C1295FSContainer. 2023-04-22 21:13:40.136 : INFO: instruction count: 3: __C1295FSContainer. 2023-04-22 21:13:40.137 : INFO: instruction count: 3: __C1297Compiled. 
2023-04-22 21:13:40.137 : INFO: instruction count: 27: __C1297Compiled.apply 2023-04-22 21:13:40.137 : INFO: instruction count: 9: __C1297Compiled.setPartitionIndex 2023-04-22 21:13:40.137 : INFO: instruction count: 4: __C1297Compiled.addPartitionRegion 2023-04-22 21:13:40.137 : INFO: instruction count: 4: __C1297Compiled.setPool 2023-04-22 21:13:40.137 : INFO: instruction count: 3: __C1297Compiled.addHailClassLoader 2023-04-22 21:13:40.137 : INFO: instruction count: 3: __C1297Compiled.addFS 2023-04-22 21:13:40.137 : INFO: instruction count: 4: __C1297Compiled.addTaskContext 2023-04-22 21:13:40.137 : INFO: instruction count: 64: __C1297Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.137 : INFO: instruction count: 18: __C1297Compiled.__m1303DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.138 : INFO: instruction count: 27: __C1297Compiled.__m1304DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.138 : INFO: instruction count: 58: __C1297Compiled.__m1305INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:40.138 : INFO: instruction count: 17: __C1297Compiled.__m1306INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:40.138 : INFO: instruction count: 31: __C1297Compiled.__m1307INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.138 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.139 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.139 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.139 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.140 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.140 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{__cols:Array[Struct{s:String}]})) 2023-04-22 21:13:40.143 : INFO: encoder cache hit 2023-04-22 21:13:40.144 MemoryStore: INFO: Block broadcast_161 stored as values in memory (estimated size 47.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.156 MemoryStore: INFO: Block broadcast_161_piece0 stored as bytes in memory (estimated size 17.5 KiB, free 25.1 GiB) 2023-04-22 21:13:40.156 BlockManagerInfo: INFO: Added broadcast_161_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.157 SparkContext: INFO: Created broadcast 161 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.157 : INFO: instruction count: 3: __C1308HailClassLoaderContainer. 2023-04-22 21:13:40.157 : INFO: instruction count: 3: __C1308HailClassLoaderContainer. 2023-04-22 21:13:40.157 : INFO: instruction count: 3: __C1310FSContainer. 2023-04-22 21:13:40.157 : INFO: instruction count: 3: __C1310FSContainer. 2023-04-22 21:13:40.158 : INFO: instruction count: 3: __C1312Compiled. 
2023-04-22 21:13:40.158 : INFO: instruction count: 27: __C1312Compiled.apply 2023-04-22 21:13:40.158 : INFO: instruction count: 9: __C1312Compiled.setPartitionIndex 2023-04-22 21:13:40.158 : INFO: instruction count: 4: __C1312Compiled.addPartitionRegion 2023-04-22 21:13:40.158 : INFO: instruction count: 4: __C1312Compiled.setPool 2023-04-22 21:13:40.158 : INFO: instruction count: 3: __C1312Compiled.addHailClassLoader 2023-04-22 21:13:40.158 : INFO: instruction count: 3: __C1312Compiled.addFS 2023-04-22 21:13:40.158 : INFO: instruction count: 4: __C1312Compiled.addTaskContext 2023-04-22 21:13:40.158 : INFO: instruction count: 64: __C1312Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.158 : INFO: instruction count: 18: __C1312Compiled.__m1318DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.159 : INFO: instruction count: 27: __C1312Compiled.__m1319DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.159 : INFO: instruction count: 58: __C1312Compiled.__m1320INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:40.159 : INFO: instruction count: 17: __C1312Compiled.__m1321INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:40.159 : INFO: instruction count: 31: __C1312Compiled.__m1322INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.162 : INFO: encoder cache hit 2023-04-22 21:13:40.163 : INFO: initial IR: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:40.163 : INFO: after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:40.164 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:40.164 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:13:40.165 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
2023-04-22 21:13:40.165 : INFO: lowering result: TableGetGlobals 2023-04-22 21:13:40.179 : INFO: compiling and evaluating result: TableGetGlobals 2023-04-22 21:13:40.180 : INFO: initial IR: IR size 9: (Let __iruid_1096 (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (MakeStruct (partitionIndex (I64 0)) (partitionPath (Str "/fg/saxena..."))))) (I32 0)) (Ref __iruid_1096)) 2023-04-22 21:13:40.182 : INFO: after optimize: relationalLowerer, initial IR: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.186 : INFO: after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.187 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.188 : INFO: after LiftRelationalValuesToRelationalLets: IR size 5: 
(ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.189 : INFO: after EvalRelationalLets: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.189 : INFO: after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.191 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.192 : INFO: after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.193 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:13:40.210 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.212 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.212 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.214 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.215 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.217 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.235 : INFO: encoder cache hit 2023-04-22 21:13:40.236 MemoryStore: INFO: Block broadcast_162 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:40.237 MemoryStore: INFO: Block broadcast_162_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:40.238 BlockManagerInfo: INFO: Added 
broadcast_162_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:40.238 SparkContext: INFO: Created broadcast 162 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.238 : INFO: instruction count: 3: __C1323HailClassLoaderContainer. 2023-04-22 21:13:40.238 : INFO: instruction count: 3: __C1323HailClassLoaderContainer. 2023-04-22 21:13:40.238 : INFO: instruction count: 3: __C1325FSContainer. 2023-04-22 21:13:40.239 : INFO: instruction count: 3: __C1325FSContainer. 2023-04-22 21:13:40.243 : INFO: instruction count: 3: __C1327Compiled. 2023-04-22 21:13:40.243 : INFO: instruction count: 45: __C1327Compiled.apply 2023-04-22 21:13:40.244 : INFO: instruction count: 475: __C1327Compiled.__m1329split_ToArray 2023-04-22 21:13:40.244 : INFO: instruction count: 31: __C1327Compiled.__m1337DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.244 : INFO: instruction count: 22: __C1327Compiled.__m1338SKIP_r_array_of_r_float64 2023-04-22 21:13:40.244 : INFO: instruction count: 3: __C1327Compiled.__m1339SKIP_r_float64 2023-04-22 21:13:40.244 : INFO: instruction count: 58: __C1327Compiled.__m1340INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.244 : INFO: instruction count: 26: __C1327Compiled.__m1341INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.244 : INFO: instruction count: 31: __C1327Compiled.__m1342INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.244 : INFO: instruction count: 58: __C1327Compiled.__m1343INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.244 : INFO: instruction count: 10: __C1327Compiled.__m1344INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.244 : INFO: instruction count: 12: __C1327Compiled.__m1347setup_jab 2023-04-22 21:13:40.244 : INFO: instruction count: 35: __C1327Compiled.__m1352arrayref_bounds_check 2023-04-22 21:13:40.245 : INFO: instruction count: 9: __C1327Compiled.setPartitionIndex 2023-04-22 21:13:40.245 : INFO: instruction count: 4: __C1327Compiled.addPartitionRegion 2023-04-22 21:13:40.245 : INFO: instruction count: 4: __C1327Compiled.setPool 2023-04-22 21:13:40.245 : INFO: instruction count: 3: __C1327Compiled.addHailClassLoader 2023-04-22 21:13:40.245 : INFO: instruction count: 3: __C1327Compiled.addFS 2023-04-22 21:13:40.245 : INFO: instruction count: 4: __C1327Compiled.addTaskContext 2023-04-22 21:13:40.246 : INFO: instruction count: 41: __C1327Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.246 : INFO: instruction count: 27: __C1327Compiled.__m1357DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.246 : INFO: instruction count: 26: __C1327Compiled.__m1358INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:40.246 : INFO: instruction count: 10: __C1327Compiled.__m1359INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:40.246 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.249 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.249 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.273 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.275 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.276 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.297 : INFO: encoder cache hit 2023-04-22 21:13:40.297 MemoryStore: INFO: Block broadcast_163 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:40.298 MemoryStore: INFO: Block broadcast_163_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:40.299 BlockManagerInfo: INFO: Added broadcast_163_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:40.299 SparkContext: INFO: Created broadcast 163 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.299 : INFO: instruction count: 3: __C1360HailClassLoaderContainer. 2023-04-22 21:13:40.299 : INFO: instruction count: 3: __C1360HailClassLoaderContainer. 2023-04-22 21:13:40.299 : INFO: instruction count: 3: __C1362FSContainer. 2023-04-22 21:13:40.299 : INFO: instruction count: 3: __C1362FSContainer. 2023-04-22 21:13:40.303 : INFO: instruction count: 3: __C1364Compiled. 
2023-04-22 21:13:40.303 : INFO: instruction count: 45: __C1364Compiled.apply 2023-04-22 21:13:40.304 : INFO: instruction count: 475: __C1364Compiled.__m1366split_ToArray 2023-04-22 21:13:40.304 : INFO: instruction count: 31: __C1364Compiled.__m1374DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.304 : INFO: instruction count: 22: __C1364Compiled.__m1375SKIP_r_array_of_r_float64 2023-04-22 21:13:40.304 : INFO: instruction count: 3: __C1364Compiled.__m1376SKIP_r_float64 2023-04-22 21:13:40.304 : INFO: instruction count: 58: __C1364Compiled.__m1377INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.304 : INFO: instruction count: 26: __C1364Compiled.__m1378INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.304 : INFO: instruction count: 31: __C1364Compiled.__m1379INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.304 : INFO: instruction count: 58: __C1364Compiled.__m1380INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.304 : INFO: instruction count: 10: __C1364Compiled.__m1381INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.304 : INFO: instruction count: 12: __C1364Compiled.__m1384setup_jab 2023-04-22 21:13:40.304 : INFO: instruction count: 35: __C1364Compiled.__m1389arrayref_bounds_check 2023-04-22 21:13:40.305 : INFO: instruction count: 9: __C1364Compiled.setPartitionIndex 2023-04-22 21:13:40.305 : INFO: instruction count: 4: __C1364Compiled.addPartitionRegion 2023-04-22 21:13:40.305 : INFO: instruction count: 4: __C1364Compiled.setPool 2023-04-22 21:13:40.305 : INFO: instruction count: 3: __C1364Compiled.addHailClassLoader 2023-04-22 21:13:40.305 : INFO: instruction count: 3: __C1364Compiled.addFS 2023-04-22 21:13:40.305 : INFO: instruction count: 4: __C1364Compiled.addTaskContext 2023-04-22 21:13:40.305 : INFO: instruction count: 41: __C1364Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.305 : INFO: instruction count: 27: __C1364Compiled.__m1394DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.305 : INFO: instruction count: 26: __C1364Compiled.__m1395INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:40.305 : INFO: instruction count: 10: __C1364Compiled.__m1396INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:40.306 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.307 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition 
Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.322 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.323 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.325 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.326 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:13:40.338 : INFO: encoder cache hit 2023-04-22 21:13:40.339 MemoryStore: INFO: Block broadcast_164 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:13:40.340 MemoryStore: INFO: Block broadcast_164_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:13:40.341 BlockManagerInfo: INFO: Added broadcast_164_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:13:40.354 SparkContext: INFO: Created broadcast 164 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.354 : INFO: instruction count: 3: __C1397HailClassLoaderContainer. 2023-04-22 21:13:40.354 : INFO: instruction count: 3: __C1397HailClassLoaderContainer. 2023-04-22 21:13:40.354 : INFO: instruction count: 3: __C1399FSContainer. 2023-04-22 21:13:40.354 : INFO: instruction count: 3: __C1399FSContainer. 2023-04-22 21:13:40.358 : INFO: instruction count: 3: __C1401Compiled. 2023-04-22 21:13:40.358 : INFO: instruction count: 45: __C1401Compiled.apply 2023-04-22 21:13:40.359 : INFO: instruction count: 475: __C1401Compiled.__m1403split_ToArray 2023-04-22 21:13:40.359 : INFO: instruction count: 31: __C1401Compiled.__m1411DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.359 : INFO: instruction count: 22: __C1401Compiled.__m1412SKIP_r_array_of_r_float64 2023-04-22 21:13:40.359 : INFO: instruction count: 3: __C1401Compiled.__m1413SKIP_r_float64 2023-04-22 21:13:40.359 : INFO: instruction count: 58: __C1401Compiled.__m1414INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.359 : INFO: instruction count: 26: __C1401Compiled.__m1415INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.359 : INFO: instruction count: 31: __C1401Compiled.__m1416INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.359 : INFO: instruction count: 58: __C1401Compiled.__m1417INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.359 : INFO: instruction count: 10: __C1401Compiled.__m1418INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.359 : INFO: instruction count: 12: __C1401Compiled.__m1421setup_jab 2023-04-22 21:13:40.359 : INFO: instruction count: 35: __C1401Compiled.__m1426arrayref_bounds_check 2023-04-22 21:13:40.359 : INFO: instruction count: 9: __C1401Compiled.setPartitionIndex 2023-04-22 21:13:40.360 : INFO: instruction count: 4: __C1401Compiled.addPartitionRegion 2023-04-22 21:13:40.360 : INFO: instruction count: 4: __C1401Compiled.setPool 2023-04-22 21:13:40.360 : INFO: instruction count: 3: __C1401Compiled.addHailClassLoader 2023-04-22 21:13:40.360 : INFO: instruction count: 3: 
__C1401Compiled.addFS 2023-04-22 21:13:40.360 : INFO: instruction count: 4: __C1401Compiled.addTaskContext 2023-04-22 21:13:40.360 : INFO: instruction count: 41: __C1401Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.360 : INFO: instruction count: 27: __C1401Compiled.__m1431DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.360 : INFO: instruction count: 26: __C1401Compiled.__m1432INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:13:40.360 : INFO: instruction count: 10: __C1401Compiled.__m1433INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:40.408 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (320.0K blocks / 192.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:40.410 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (704.0K blocks / 320.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:40.413 : INFO: encoder cache miss (28 hits, 16 misses, 0.636) 2023-04-22 21:13:40.415 : INFO: instruction count: 3: __C1434HailClassLoaderContainer. 2023-04-22 21:13:40.415 : INFO: instruction count: 3: __C1434HailClassLoaderContainer. 2023-04-22 21:13:40.415 : INFO: instruction count: 3: __C1436FSContainer. 2023-04-22 21:13:40.415 : INFO: instruction count: 3: __C1436FSContainer. 2023-04-22 21:13:40.419 : INFO: instruction count: 3: __C1438etypeEncode. 2023-04-22 21:13:40.419 : INFO: instruction count: 7: __C1438etypeEncode.apply 2023-04-22 21:13:40.419 : INFO: instruction count: 25: __C1438etypeEncode.__m1440ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND 2023-04-22 21:13:40.419 : INFO: instruction count: 35: __C1438etypeEncode.__m1441ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:40.419 : INFO: instruction count: 37: __C1438etypeEncode.__m1442ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_array_of_r_float64END 2023-04-22 21:13:40.419 : INFO: instruction count: 16: __C1438etypeEncode.__m1443ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:40.419 : INFO: instruction count: 39: __C1438etypeEncode.__m1444ENCODE_SIndexablePointer_TO_r_array_of_r_float64 2023-04-22 21:13:40.419 : INFO: instruction count: 4: __C1438etypeEncode.__m1445ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:40.447 : INFO: took 282.701ms 2023-04-22 21:13:40.447 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: initial IR: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (EncodedLiteral 
Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after EvalRelationalLets: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.448 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.449 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:13:40.451 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.451 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.451 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.452 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.452 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.453 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.462 : INFO: encoder cache hit 2023-04-22 21:13:40.462 MemoryStore: INFO: Block broadcast_165 stored as values in memory (estimated size 388.0 KiB, free 25.1 GiB) 2023-04-22 21:13:40.466 MemoryStore: INFO: Block broadcast_165_piece0 stored as bytes in memory (estimated size 351.3 KiB, free 25.1 GiB) 2023-04-22 21:13:40.467 BlockManagerInfo: INFO: Added broadcast_165_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 351.3 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.480 SparkContext: INFO: Created broadcast 165 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.480 : INFO: instruction count: 3: __C1446HailClassLoaderContainer. 2023-04-22 21:13:40.480 : INFO: instruction count: 3: __C1446HailClassLoaderContainer. 2023-04-22 21:13:40.480 : INFO: instruction count: 3: __C1448FSContainer. 2023-04-22 21:13:40.480 : INFO: instruction count: 3: __C1448FSContainer. 2023-04-22 21:13:40.482 : INFO: instruction count: 3: __C1450Compiled. 
2023-04-22 21:13:40.482 : INFO: instruction count: 27: __C1450Compiled.apply 2023-04-22 21:13:40.482 : INFO: instruction count: 9: __C1450Compiled.setPartitionIndex 2023-04-22 21:13:40.482 : INFO: instruction count: 4: __C1450Compiled.addPartitionRegion 2023-04-22 21:13:40.482 : INFO: instruction count: 4: __C1450Compiled.setPool 2023-04-22 21:13:40.482 : INFO: instruction count: 3: __C1450Compiled.addHailClassLoader 2023-04-22 21:13:40.482 : INFO: instruction count: 3: __C1450Compiled.addFS 2023-04-22 21:13:40.482 : INFO: instruction count: 4: __C1450Compiled.addTaskContext 2023-04-22 21:13:40.482 : INFO: instruction count: 64: __C1450Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.482 : INFO: instruction count: 18: __C1450Compiled.__m1456DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.482 : INFO: instruction count: 27: __C1450Compiled.__m1457DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.482 : INFO: instruction count: 58: __C1450Compiled.__m1458INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.482 : INFO: instruction count: 26: __C1450Compiled.__m1459INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.482 : INFO: instruction count: 31: __C1450Compiled.__m1460INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.482 : INFO: instruction count: 58: __C1450Compiled.__m1461INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.483 : INFO: instruction count: 10: __C1450Compiled.__m1462INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.483 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.483 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.483 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.484 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.484 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.485 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.492 : INFO: encoder cache hit 2023-04-22 21:13:40.493 MemoryStore: INFO: Block broadcast_166 stored as values in memory (estimated size 388.0 KiB, free 25.1 GiB) 2023-04-22 21:13:40.509 MemoryStore: INFO: Block broadcast_166_piece0 stored as bytes in memory (estimated size 351.3 KiB, free 25.1 GiB) 2023-04-22 21:13:40.509 BlockManagerInfo: INFO: Added broadcast_166_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 351.3 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.510 SparkContext: INFO: Created broadcast 166 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.510 : INFO: instruction count: 3: __C1463HailClassLoaderContainer. 2023-04-22 21:13:40.510 : INFO: instruction count: 3: __C1463HailClassLoaderContainer. 
2023-04-22 21:13:40.510 : INFO: instruction count: 3: __C1465FSContainer. 2023-04-22 21:13:40.510 : INFO: instruction count: 3: __C1465FSContainer. 2023-04-22 21:13:40.511 : INFO: instruction count: 3: __C1467Compiled. 2023-04-22 21:13:40.511 : INFO: instruction count: 27: __C1467Compiled.apply 2023-04-22 21:13:40.511 : INFO: instruction count: 9: __C1467Compiled.setPartitionIndex 2023-04-22 21:13:40.511 : INFO: instruction count: 4: __C1467Compiled.addPartitionRegion 2023-04-22 21:13:40.512 : INFO: instruction count: 4: __C1467Compiled.setPool 2023-04-22 21:13:40.512 : INFO: instruction count: 3: __C1467Compiled.addHailClassLoader 2023-04-22 21:13:40.512 : INFO: instruction count: 3: __C1467Compiled.addFS 2023-04-22 21:13:40.512 : INFO: instruction count: 4: __C1467Compiled.addTaskContext 2023-04-22 21:13:40.512 : INFO: instruction count: 64: __C1467Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.512 : INFO: instruction count: 18: __C1467Compiled.__m1473DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.512 : INFO: instruction count: 27: __C1467Compiled.__m1474DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.512 : INFO: instruction count: 58: __C1467Compiled.__m1475INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.512 : INFO: instruction count: 26: __C1467Compiled.__m1476INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.512 : INFO: instruction count: 31: __C1467Compiled.__m1477INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.512 : INFO: instruction count: 58: __C1467Compiled.__m1478INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.512 : INFO: instruction count: 10: __C1467Compiled.__m1479INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.514 : INFO: initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.515 : INFO: after optimize: compileLowerer, initial IR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.515 : INFO: after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.515 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.516 : INFO: after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.516 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 2: (MakeTuple (0) (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]})) 2023-04-22 21:13:40.537 : INFO: encoder cache hit 2023-04-22 21:13:40.537 MemoryStore: INFO: Block broadcast_167 stored as values in memory (estimated size 388.0 KiB, free 25.1 GiB) 2023-04-22 21:13:40.540 MemoryStore: INFO: Block broadcast_167_piece0 stored as bytes in memory (estimated size 351.3 KiB, free 25.1 GiB) 2023-04-22 21:13:40.541 BlockManagerInfo: INFO: Added broadcast_167_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 351.3 KiB, free: 25.3 GiB) 2023-04-22 21:13:40.541 SparkContext: INFO: Created broadcast 167 
from broadcast at SparkBackend.scala:354 2023-04-22 21:13:40.541 : INFO: instruction count: 3: __C1480HailClassLoaderContainer. 2023-04-22 21:13:40.542 : INFO: instruction count: 3: __C1480HailClassLoaderContainer. 2023-04-22 21:13:40.542 : INFO: instruction count: 3: __C1482FSContainer. 2023-04-22 21:13:40.542 : INFO: instruction count: 3: __C1482FSContainer. 2023-04-22 21:13:40.543 : INFO: instruction count: 3: __C1484Compiled. 2023-04-22 21:13:40.543 : INFO: instruction count: 27: __C1484Compiled.apply 2023-04-22 21:13:40.543 : INFO: instruction count: 9: __C1484Compiled.setPartitionIndex 2023-04-22 21:13:40.543 : INFO: instruction count: 4: __C1484Compiled.addPartitionRegion 2023-04-22 21:13:40.543 : INFO: instruction count: 4: __C1484Compiled.setPool 2023-04-22 21:13:40.543 : INFO: instruction count: 3: __C1484Compiled.addHailClassLoader 2023-04-22 21:13:40.543 : INFO: instruction count: 3: __C1484Compiled.addFS 2023-04-22 21:13:40.543 : INFO: instruction count: 4: __C1484Compiled.addTaskContext 2023-04-22 21:13:40.543 : INFO: instruction count: 64: __C1484Compiled.addAndDecodeLiterals 2023-04-22 21:13:40.543 : INFO: instruction count: 18: __C1484Compiled.__m1490DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:40.544 : INFO: instruction count: 27: __C1484Compiled.__m1491DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:40.544 : INFO: instruction count: 58: __C1484Compiled.__m1492INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.544 : INFO: instruction count: 26: __C1484Compiled.__m1493INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:40.544 : INFO: instruction count: 31: __C1484Compiled.__m1494INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:40.544 : INFO: instruction count: 58: __C1484Compiled.__m1495INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:40.544 : INFO: instruction count: 10: __C1484Compiled.__m1496INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:40.566 : INFO: encoder cache hit 2023-04-22 21:13:40.569 : INFO: initial IR: IR size 53: (TableAggregate (TableParallelize None (Let __iruid_1074 (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) (Let __iruid_1075 (ToDict (StreamMap __iruid_1076 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1076)) (SelectFields (scores) (Ref __iruid_1076))))) (Let __iruid_1077 (ToArray (StreamMap __iruid_1078 (ToStream False (GetField __cols (Ref __iruid_1074))) (InsertFields (Ref __iruid_1078) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1075) (MakeStruct (s (GetField s (Ref __iruid_1078))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1079 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1077)) (I32 1)) (Let __iruid_1080 (ArrayRef -1 (Ref __iruid_1077) (Ref __iruid_1079)) (InsertFields (SelectFields () (Ref __iruid_1080)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1080))))))))) (global (SelectFields () (Ref __iruid_1074)))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))) 2023-04-22 21:13:40.569 : INFO: after LowerAndExecuteShuffles: IR size 53: (TableAggregate (TableParallelize None (Let __iruid_1074 (EncodedLiteral 
Struct{__cols:Array[Struct{s:String}]}) (Let __iruid_1075 (ToDict (StreamMap __iruid_1076 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1076)) (SelectFields (scores) (Ref __iruid_1076))))) (Let __iruid_1077 (ToArray (StreamMap __iruid_1078 (ToStream False (GetField __cols (Ref __iruid_1074))) (InsertFields (Ref __iruid_1078) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1075) (MakeStruct (s (GetField s (Ref __iruid_1078))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1079 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1077)) (I32 1)) (Let __iruid_1080 (ArrayRef -1 (Ref __iruid_1077) (Ref __iruid_1079)) (InsertFields (SelectFields () (Ref __iruid_1080)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1080))))))))) (global (SelectFields () (Ref __iruid_1074)))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))) 2023-04-22 21:13:40.617 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 53: (TableAggregate (TableParallelize None (Let __iruid_1140 (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) (Let __iruid_1141 (ToDict (StreamMap __iruid_1142 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1142)) (SelectFields (scores) (Ref __iruid_1142))))) (Let __iruid_1143 (ToArray (StreamMap __iruid_1144 (ToStream False (GetField __cols (Ref __iruid_1140))) (InsertFields (Ref __iruid_1144) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1141) (MakeStruct (s (GetField s (Ref __iruid_1144))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1145 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1143)) (I32 1)) (Let __iruid_1146 (ArrayRef -1 (Ref __iruid_1143) (Ref __iruid_1145)) (InsertFields (SelectFields () (Ref __iruid_1146)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1146))))))))) (global (SelectFields () (Ref __iruid_1140)))))))) (ApplyAggOp Sum () ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))) 2023-04-22 21:13:40.620 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
2023-04-22 21:13:40.620 : INFO: lowering result: TableAggregate 2023-04-22 21:13:40.662 : INFO: Aggregate: useTreeAggregate=false 2023-04-22 21:13:40.662 : INFO: Aggregate: commutative=true 2023-04-22 21:13:40.664 : INFO: compiling and evaluating result: TableAggregate 2023-04-22 21:13:40.667 : INFO: initial IR: IR size 151: (Let __iruid_1157 (Let __iruid_1140 (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) (Let __iruid_1141 (ToDict (StreamMap __iruid_1142 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1142)) (SelectFields (scores) (Ref __iruid_1142))))) (Let __iruid_1143 (ToArray (StreamMap __iruid_1144 (ToStream False (GetField __cols (Ref __iruid_1140))) (InsertFields (Ref __iruid_1144) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1141) (MakeStruct (s (GetField s (Ref __iruid_1144))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_1145 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1143)) (I32 1)) (Let __iruid_1146 (ArrayRef -1 (Ref __iruid_1143) (Ref __iruid_1145)) (InsertFields (SelectFields () (Ref __iruid_1146)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1146))))))))) (global (SelectFields () (Ref __iruid_1140))))))) (Let __iruid_1167 (GetField global (Ref __iruid_1157)) (Let __iruid_1169 (Let global (Ref __iruid_1167) (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64))))) (Let __iruid_1171 (CollectDistributedArray table_aggregate_singlestage __iruid_1168 __iruid_1170 (Let __iruid_1158 (ArrayLen (GetField rows (Ref __iruid_1157))) (Let __iruid_1160 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1159 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1159) (Ref __iruid_1158)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1158))) (Let __iruid_1164 (GetField rows (Ref __iruid_1157)) (StreamMap __iruid_1165 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1166 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1160) (Ref __iruid_1165)) (ArrayRef -1 (Ref __iruid_1160) (ApplyBinaryPrimOp Add (Ref __iruid_1165) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1164) (Ref __iruid_1166)))))))) (MakeStruct (__iruid_1167 (Ref __iruid_1167)) (__iruid_1169 (Ref __iruid_1169))) (Let __iruid_1169 (GetField __iruid_1169 (Ref __iruid_1170)) (Let __iruid_1167 (GetField __iruid_1167 (Ref __iruid_1170)) (Let global (Ref __iruid_1167) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1169)))) (StreamFor row (ToStream True (Ref __iruid_1168)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref row))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64))))))) (NA String)) (Let global (Ref __iruid_1167) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1169)))) (StreamFor __iruid_1172 (ToStream True (Ref __iruid_1171)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1172)))))) (Let __iruid_1156 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1156))))))))) 2023-04-22 21:13:40.707 : INFO: Prune: MakeStruct: eliminating field 
'__iruid_1167' 2023-04-22 21:13:40.761 : INFO: Prune: MakeStruct: eliminating field 'global' 2023-04-22 21:13:40.801 : INFO: after optimize: relationalLowerer, initial IR: IR size 132: (Let __iruid_1280 (ToDict (StreamMap __iruid_1281 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1281)) (SelectFields (scores) (Ref __iruid_1281))))) (Let __iruid_1282 (ToArray (StreamMap __iruid_1283 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1283) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1280) (MakeStruct (s (GetField s (Ref __iruid_1283))))))))) (Let __iruid_1284 (MakeStruct (rows (ToArray (StreamMap __iruid_1285 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1282)) (I32 1)) (Let __iruid_1286 (ArrayRef -1 (Ref __iruid_1282) (Ref __iruid_1285)) (InsertFields (SelectFields () (Ref __iruid_1286)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1286)))))))))) (Let __iruid_1287 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1288 (CollectDistributedArray table_aggregate_singlestage __iruid_1289 __iruid_1290 (Let __iruid_1291 (ArrayLen (GetField rows (Ref __iruid_1284))) (Let __iruid_1292 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1293 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1293) (Ref __iruid_1291)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1291))) (Let __iruid_1294 (GetField rows (Ref __iruid_1284)) (StreamMap __iruid_1295 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1296 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1292) (Ref __iruid_1295)) (ArrayRef -1 (Ref __iruid_1292) (ApplyBinaryPrimOp Add (Ref __iruid_1295) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1294) (Ref __iruid_1296)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1287))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1290))))) (StreamFor __iruid_1297 (ToStream True (Ref __iruid_1289)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1297))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1287)))) (StreamFor __iruid_1298 (ToStream True (Ref __iruid_1288)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1298)))))) (Let __iruid_1299 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1299))))))))) 2023-04-22 21:13:40.815 : INFO: after LowerMatrixToTable: IR size 132: (Let __iruid_1280 (ToDict (StreamMap __iruid_1281 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1281)) (SelectFields (scores) (Ref __iruid_1281))))) (Let __iruid_1282 (ToArray (StreamMap __iruid_1283 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1283) None (__uid_4 (ApplyIR -1 get () 
Struct{scores:Array[Float64]} (Ref __iruid_1280) (MakeStruct (s (GetField s (Ref __iruid_1283))))))))) (Let __iruid_1284 (MakeStruct (rows (ToArray (StreamMap __iruid_1285 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1282)) (I32 1)) (Let __iruid_1286 (ArrayRef -1 (Ref __iruid_1282) (Ref __iruid_1285)) (InsertFields (SelectFields () (Ref __iruid_1286)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1286)))))))))) (Let __iruid_1287 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1288 (CollectDistributedArray table_aggregate_singlestage __iruid_1289 __iruid_1290 (Let __iruid_1291 (ArrayLen (GetField rows (Ref __iruid_1284))) (Let __iruid_1292 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1293 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1293) (Ref __iruid_1291)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1291))) (Let __iruid_1294 (GetField rows (Ref __iruid_1284)) (StreamMap __iruid_1295 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1296 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1292) (Ref __iruid_1295)) (ArrayRef -1 (Ref __iruid_1292) (ApplyBinaryPrimOp Add (Ref __iruid_1295) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1294) (Ref __iruid_1296)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1287))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1290))))) (StreamFor __iruid_1297 (ToStream True (Ref __iruid_1289)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1297))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1287)))) (StreamFor __iruid_1298 (ToStream True (Ref __iruid_1288)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1298)))))) (Let __iruid_1299 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1299))))))))) 2023-04-22 21:13:40.923 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 132: (Let __iruid_1412 (ToDict (StreamMap __iruid_1413 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1413)) (SelectFields (scores) (Ref __iruid_1413))))) (Let __iruid_1414 (ToArray (StreamMap __iruid_1415 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1415) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1412) (MakeStruct (s (GetField s (Ref __iruid_1415))))))))) (Let __iruid_1416 (MakeStruct (rows (ToArray (StreamMap __iruid_1417 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1414)) (I32 1)) (Let __iruid_1418 (ArrayRef -1 (Ref __iruid_1414) (Ref __iruid_1417)) (InsertFields (SelectFields () (Ref __iruid_1418)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1418)))))))))) (Let __iruid_1419 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1420 (CollectDistributedArray 
table_aggregate_singlestage __iruid_1421 __iruid_1422 (Let __iruid_1423 (ArrayLen (GetField rows (Ref __iruid_1416))) (Let __iruid_1424 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1425 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1425) (Ref __iruid_1423)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1423))) (Let __iruid_1426 (GetField rows (Ref __iruid_1416)) (StreamMap __iruid_1427 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1428 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1424) (Ref __iruid_1427)) (ArrayRef -1 (Ref __iruid_1424) (ApplyBinaryPrimOp Add (Ref __iruid_1427) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1426) (Ref __iruid_1428)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1419))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1422))))) (StreamFor __iruid_1429 (ToStream True (Ref __iruid_1421)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1429))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1419)))) (StreamFor __iruid_1430 (ToStream True (Ref __iruid_1420)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1430)))))) (Let __iruid_1431 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1431))))))))) 2023-04-22 21:13:40.928 : INFO: after LiftRelationalValuesToRelationalLets: IR size 132: (Let __iruid_1412 (ToDict (StreamMap __iruid_1413 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1413)) (SelectFields (scores) (Ref __iruid_1413))))) (Let __iruid_1414 (ToArray (StreamMap __iruid_1415 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1415) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1412) (MakeStruct (s (GetField s (Ref __iruid_1415))))))))) (Let __iruid_1416 (MakeStruct (rows (ToArray (StreamMap __iruid_1417 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1414)) (I32 1)) (Let __iruid_1418 (ArrayRef -1 (Ref __iruid_1414) (Ref __iruid_1417)) (InsertFields (SelectFields () (Ref __iruid_1418)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1418)))))))))) (Let __iruid_1419 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1420 (CollectDistributedArray table_aggregate_singlestage __iruid_1421 __iruid_1422 (Let __iruid_1423 (ArrayLen (GetField rows (Ref __iruid_1416))) (Let __iruid_1424 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1425 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1425) (Ref __iruid_1423)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1423))) (Let __iruid_1426 (GetField rows (Ref __iruid_1416)) (StreamMap __iruid_1427 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1428 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1424) (Ref __iruid_1427)) (ArrayRef -1 (Ref 
__iruid_1424) (ApplyBinaryPrimOp Add (Ref __iruid_1427) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1426) (Ref __iruid_1428)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1419))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1422))))) (StreamFor __iruid_1429 (ToStream True (Ref __iruid_1421)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1429))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1419)))) (StreamFor __iruid_1430 (ToStream True (Ref __iruid_1420)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1430)))))) (Let __iruid_1431 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1431))))))))) 2023-04-22 21:13:40.931 : INFO: after EvalRelationalLets: IR size 132: (Let __iruid_1412 (ToDict (StreamMap __iruid_1413 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1413)) (SelectFields (scores) (Ref __iruid_1413))))) (Let __iruid_1414 (ToArray (StreamMap __iruid_1415 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1415) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1412) (MakeStruct (s (GetField s (Ref __iruid_1415))))))))) (Let __iruid_1416 (MakeStruct (rows (ToArray (StreamMap __iruid_1417 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1414)) (I32 1)) (Let __iruid_1418 (ArrayRef -1 (Ref __iruid_1414) (Ref __iruid_1417)) (InsertFields (SelectFields () (Ref __iruid_1418)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1418)))))))))) (Let __iruid_1419 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1420 (CollectDistributedArray table_aggregate_singlestage __iruid_1421 __iruid_1422 (Let __iruid_1423 (ArrayLen (GetField rows (Ref __iruid_1416))) (Let __iruid_1424 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1425 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1425) (Ref __iruid_1423)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1423))) (Let __iruid_1426 (GetField rows (Ref __iruid_1416)) (StreamMap __iruid_1427 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1428 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1424) (Ref __iruid_1427)) (ArrayRef -1 (Ref __iruid_1424) (ApplyBinaryPrimOp Add (Ref __iruid_1427) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1426) (Ref __iruid_1428)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1419))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1422))))) (StreamFor __iruid_1429 (ToStream True (Ref __iruid_1421)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1429))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 
(TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1419)))) (StreamFor __iruid_1430 (ToStream True (Ref __iruid_1420)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1430)))))) (Let __iruid_1431 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1431))))))))) 2023-04-22 21:13:40.934 : INFO: after LowerAndExecuteShuffles: IR size 132: (Let __iruid_1412 (ToDict (StreamMap __iruid_1413 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1413)) (SelectFields (scores) (Ref __iruid_1413))))) (Let __iruid_1414 (ToArray (StreamMap __iruid_1415 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1415) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1412) (MakeStruct (s (GetField s (Ref __iruid_1415))))))))) (Let __iruid_1416 (MakeStruct (rows (ToArray (StreamMap __iruid_1417 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1414)) (I32 1)) (Let __iruid_1418 (ArrayRef -1 (Ref __iruid_1414) (Ref __iruid_1417)) (InsertFields (SelectFields () (Ref __iruid_1418)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1418)))))))))) (Let __iruid_1419 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1420 (CollectDistributedArray table_aggregate_singlestage __iruid_1421 __iruid_1422 (Let __iruid_1423 (ArrayLen (GetField rows (Ref __iruid_1416))) (Let __iruid_1424 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1425 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1425) (Ref __iruid_1423)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1423))) (Let __iruid_1426 (GetField rows (Ref __iruid_1416)) (StreamMap __iruid_1427 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1428 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1424) (Ref __iruid_1427)) (ArrayRef -1 (Ref __iruid_1424) (ApplyBinaryPrimOp Add (Ref __iruid_1427) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1426) (Ref __iruid_1428)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1419))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1422))))) (StreamFor __iruid_1429 (ToStream True (Ref __iruid_1421)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1429))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1419)))) (StreamFor __iruid_1430 (ToStream True (Ref __iruid_1420)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1430)))))) (Let __iruid_1431 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1431))))))))) 2023-04-22 21:13:41.020 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 132: (Let __iruid_1544 (ToDict (StreamMap __iruid_1545 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1545)) 
(SelectFields (scores) (Ref __iruid_1545))))) (Let __iruid_1546 (ToArray (StreamMap __iruid_1547 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1547) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1544) (MakeStruct (s (GetField s (Ref __iruid_1547))))))))) (Let __iruid_1548 (MakeStruct (rows (ToArray (StreamMap __iruid_1549 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1546)) (I32 1)) (Let __iruid_1550 (ArrayRef -1 (Ref __iruid_1546) (Ref __iruid_1549)) (InsertFields (SelectFields () (Ref __iruid_1550)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1550)))))))))) (Let __iruid_1551 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1552 (CollectDistributedArray table_aggregate_singlestage __iruid_1553 __iruid_1554 (Let __iruid_1555 (ArrayLen (GetField rows (Ref __iruid_1548))) (Let __iruid_1556 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1557 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1557) (Ref __iruid_1555)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1555))) (Let __iruid_1558 (GetField rows (Ref __iruid_1548)) (StreamMap __iruid_1559 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1560 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1556) (Ref __iruid_1559)) (ArrayRef -1 (Ref __iruid_1556) (ApplyBinaryPrimOp Add (Ref __iruid_1559) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1558) (Ref __iruid_1560)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1551))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1554))))) (StreamFor __iruid_1561 (ToStream True (Ref __iruid_1553)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1561))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1551)))) (StreamFor __iruid_1562 (ToStream True (Ref __iruid_1552)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1562)))))) (Let __iruid_1563 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1563))))))))) 2023-04-22 21:13:41.037 : INFO: after LowerOrInterpretNonCompilable: IR size 132: (Let __iruid_1544 (ToDict (StreamMap __iruid_1545 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1545)) (SelectFields (scores) (Ref __iruid_1545))))) (Let __iruid_1546 (ToArray (StreamMap __iruid_1547 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1547) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1544) (MakeStruct (s (GetField s (Ref __iruid_1547))))))))) (Let __iruid_1548 (MakeStruct (rows (ToArray (StreamMap __iruid_1549 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1546)) (I32 1)) (Let __iruid_1550 (ArrayRef -1 (Ref __iruid_1546) (Ref __iruid_1549)) (InsertFields (SelectFields () (Ref __iruid_1550)) None (__scores (GetField scores (GetField __uid_4 (Ref 
__iruid_1550)))))))))) (Let __iruid_1551 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1552 (CollectDistributedArray table_aggregate_singlestage __iruid_1553 __iruid_1554 (Let __iruid_1555 (ArrayLen (GetField rows (Ref __iruid_1548))) (Let __iruid_1556 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1557 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1557) (Ref __iruid_1555)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1555))) (Let __iruid_1558 (GetField rows (Ref __iruid_1548)) (StreamMap __iruid_1559 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1560 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1556) (Ref __iruid_1559)) (ArrayRef -1 (Ref __iruid_1556) (ApplyBinaryPrimOp Add (Ref __iruid_1559) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1558) (Ref __iruid_1560)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1551))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1554))))) (StreamFor __iruid_1561 (ToStream True (Ref __iruid_1553)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1561))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1551)))) (StreamFor __iruid_1562 (ToStream True (Ref __iruid_1552)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1562)))))) (Let __iruid_1563 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1563))))))))) 2023-04-22 21:13:41.127 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 132: (Let __iruid_1676 (ToDict (StreamMap __iruid_1677 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1677)) (SelectFields (scores) (Ref __iruid_1677))))) (Let __iruid_1678 (ToArray (StreamMap __iruid_1679 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1679) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1676) (MakeStruct (s (GetField s (Ref __iruid_1679))))))))) (Let __iruid_1680 (MakeStruct (rows (ToArray (StreamMap __iruid_1681 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1678)) (I32 1)) (Let __iruid_1682 (ArrayRef -1 (Ref __iruid_1678) (Ref __iruid_1681)) (InsertFields (SelectFields () (Ref __iruid_1682)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1682)))))))))) (Let __iruid_1683 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1684 (CollectDistributedArray table_aggregate_singlestage __iruid_1685 __iruid_1686 (Let __iruid_1687 (ArrayLen (GetField rows (Ref __iruid_1680))) (Let __iruid_1688 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1689 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1689) (Ref __iruid_1687)) (I32 16)))) (MakeArray Array[Int32] (Ref 
__iruid_1687))) (Let __iruid_1690 (GetField rows (Ref __iruid_1680)) (StreamMap __iruid_1691 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1692 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1688) (Ref __iruid_1691)) (ArrayRef -1 (Ref __iruid_1688) (ApplyBinaryPrimOp Add (Ref __iruid_1691) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1690) (Ref __iruid_1692)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1683))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1686))))) (StreamFor __iruid_1693 (ToStream True (Ref __iruid_1685)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1693))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1683)))) (StreamFor __iruid_1694 (ToStream True (Ref __iruid_1684)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1694)))))) (Let __iruid_1695 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1695))))))))) 2023-04-22 21:13:41.150 : INFO: initial IR: IR size 133: (MakeTuple (0) (Let __iruid_1676 (ToDict (StreamMap __iruid_1677 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1677)) (SelectFields (scores) (Ref __iruid_1677))))) (Let __iruid_1678 (ToArray (StreamMap __iruid_1679 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1679) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1676) (MakeStruct (s (GetField s (Ref __iruid_1679))))))))) (Let __iruid_1680 (MakeStruct (rows (ToArray (StreamMap __iruid_1681 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1678)) (I32 1)) (Let __iruid_1682 (ArrayRef -1 (Ref __iruid_1678) (Ref __iruid_1681)) (InsertFields (SelectFields () (Ref __iruid_1682)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1682)))))))))) (Let __iruid_1683 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1684 (CollectDistributedArray table_aggregate_singlestage __iruid_1685 __iruid_1686 (Let __iruid_1687 (ArrayLen (GetField rows (Ref __iruid_1680))) (Let __iruid_1688 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1689 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1689) (Ref __iruid_1687)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1687))) (Let __iruid_1690 (GetField rows (Ref __iruid_1680)) (StreamMap __iruid_1691 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1692 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1688) (Ref __iruid_1691)) (ArrayRef -1 (Ref __iruid_1688) (ApplyBinaryPrimOp Add (Ref __iruid_1691) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1690) (Ref __iruid_1692)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1683))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1686))))) (StreamFor __iruid_1693 (ToStream True (Ref __iruid_1685)) (Begin 
(SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1693))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1683)))) (StreamFor __iruid_1694 (ToStream True (Ref __iruid_1684)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1694)))))) (Let __iruid_1695 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1695)))))))))) 2023-04-22 21:13:41.242 : INFO: after optimize: compileLowerer, initial IR: IR size 133: (MakeTuple (0) (Let __iruid_1808 (ToDict (StreamMap __iruid_1809 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1809)) (SelectFields (scores) (Ref __iruid_1809))))) (Let __iruid_1810 (ToArray (StreamMap __iruid_1811 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1811) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1808) (MakeStruct (s (GetField s (Ref __iruid_1811))))))))) (Let __iruid_1812 (MakeStruct (rows (ToArray (StreamMap __iruid_1813 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1810)) (I32 1)) (Let __iruid_1814 (ArrayRef -1 (Ref __iruid_1810) (Ref __iruid_1813)) (InsertFields (SelectFields () (Ref __iruid_1814)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1814)))))))))) (Let __iruid_1815 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1816 (CollectDistributedArray table_aggregate_singlestage __iruid_1817 __iruid_1818 (Let __iruid_1819 (ArrayLen (GetField rows (Ref __iruid_1812))) (Let __iruid_1820 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1821 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1821) (Ref __iruid_1819)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1819))) (Let __iruid_1822 (GetField rows (Ref __iruid_1812)) (StreamMap __iruid_1823 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1824 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1820) (Ref __iruid_1823)) (ArrayRef -1 (Ref __iruid_1820) (ApplyBinaryPrimOp Add (Ref __iruid_1823) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1822) (Ref __iruid_1824)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1815))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1818))))) (StreamFor __iruid_1825 (ToStream True (Ref __iruid_1817)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1825))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1815)))) (StreamFor __iruid_1826 (ToStream True (Ref __iruid_1816)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1826)))))) (Let __iruid_1827 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1827)))))))))) 2023-04-22 21:13:41.247 : INFO: after 
InlineApplyIR: IR size 179: (MakeTuple (0) (Let __iruid_1808 (ToDict (StreamMap __iruid_1809 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1809)) (SelectFields (scores) (Ref __iruid_1809))))) (Let __iruid_1810 (ToArray (StreamMap __iruid_1811 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1811) None (__uid_4 (Let __iruid_1840 (Ref __iruid_1808) (Let __iruid_1841 (MakeStruct (s (GetField s (Ref __iruid_1811)))) (If (IsNA (Ref __iruid_1840)) (NA Struct{scores:Array[Float64]}) (Let __iruid_1842 (LowerBoundOnOrderedCollection True (Ref __iruid_1840) (Ref __iruid_1841)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_1842) (ArrayLen (CastToArray (Ref __iruid_1840)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_1840)) (Ref __iruid_1842))) (Ref __iruid_1841)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_1840)) (Ref __iruid_1842))) (NA Struct{scores:Array[Float64]}))))))))))) (Let __iruid_1812 (MakeStruct (rows (ToArray (StreamMap __iruid_1813 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1810)) (I32 1)) (Let __iruid_1814 (ArrayRef -1 (Ref __iruid_1810) (Ref __iruid_1813)) (InsertFields (SelectFields () (Ref __iruid_1814)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1814)))))))))) (Let __iruid_1815 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1816 (CollectDistributedArray table_aggregate_singlestage __iruid_1817 __iruid_1818 (Let __iruid_1819 (ArrayLen (GetField rows (Ref __iruid_1812))) (Let __iruid_1820 (Let __iruid_1843 (ToArray (StreamMap __iruid_1821 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1821) (Ref __iruid_1819)) (I32 16)))) (Let __iruid_1844 (MakeArray Array[Int32] (Ref __iruid_1819)) (If (IsNA (Ref __iruid_1843)) (NA Array[Int32]) (If (IsNA (Ref __iruid_1844)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_1845 (MakeStream Stream[Array[Int32]] False (Ref __iruid_1843) (Ref __iruid_1844)) (ToStream False (Ref __iruid_1845)))))))) (Let __iruid_1822 (GetField rows (Ref __iruid_1812)) (StreamMap __iruid_1823 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1824 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1820) (Ref __iruid_1823)) (ArrayRef -1 (Ref __iruid_1820) (ApplyBinaryPrimOp Add (Ref __iruid_1823) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1822) (Ref __iruid_1824)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1815))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1818))))) (StreamFor __iruid_1825 (ToStream True (Ref __iruid_1817)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1825))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1815)))) (StreamFor __iruid_1826 (ToStream True (Ref __iruid_1816)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1826)))))) (Let __iruid_1827 (MakeTuple (0) (ResultOp 0 
(Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1827)))))))))) 2023-04-22 21:13:41.363 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 177: (MakeTuple (0) (Let __iruid_1897 (ToDict (StreamMap __iruid_1898 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1898)) (SelectFields (scores) (Ref __iruid_1898))))) (Let __iruid_1899 (ToArray (StreamMap __iruid_1900 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1900) None (__uid_4 (Let __iruid_1901 (MakeStruct (s (GetField s (Ref __iruid_1900)))) (If (IsNA (Ref __iruid_1897)) (NA Struct{scores:Array[Float64]}) (Let __iruid_1902 (LowerBoundOnOrderedCollection True (Ref __iruid_1897) (Ref __iruid_1901)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_1902) (ArrayLen (CastToArray (Ref __iruid_1897)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_1897)) (Ref __iruid_1902))) (Ref __iruid_1901)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_1897)) (Ref __iruid_1902))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_1903 (MakeStruct (rows (ToArray (StreamMap __iruid_1904 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1899)) (I32 1)) (Let __iruid_1905 (ArrayRef -1 (Ref __iruid_1899) (Ref __iruid_1904)) (InsertFields (SelectFields () (Ref __iruid_1905)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1905)))))))))) (Let __iruid_1906 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1907 (CollectDistributedArray table_aggregate_singlestage __iruid_1908 __iruid_1909 (Let __iruid_1910 (ArrayLen (GetField rows (Ref __iruid_1903))) (Let __iruid_1911 (ToArray (StreamMap __iruid_1912 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1912) (Ref __iruid_1910)) (I32 16)))) (Let __iruid_1913 (MakeArray Array[Int32] (Ref __iruid_1910)) (Let __iruid_1914 (If (IsNA (Ref __iruid_1911)) (NA Array[Int32]) (If (IsNA (Ref __iruid_1913)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_1915 (MakeStream Stream[Array[Int32]] False (Ref __iruid_1911) (Ref __iruid_1913)) (ToStream False (Ref __iruid_1915)))))) (Let __iruid_1916 (GetField rows (Ref __iruid_1903)) (StreamMap __iruid_1917 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1918 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1914) (Ref __iruid_1917)) (ArrayRef -1 (Ref __iruid_1914) (ApplyBinaryPrimOp Add (Ref __iruid_1917) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1916) (Ref __iruid_1918)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1906))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1909))))) (StreamFor __iruid_1919 (ToStream True (Ref __iruid_1908)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1919))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1906)))) (StreamFor __iruid_1920 (ToStream True (Ref __iruid_1907)) (Begin (CombOpValue 0 (Sum 
(TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1920)))))) (Let __iruid_1921 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1921)))))))))) 2023-04-22 21:13:41.385 : INFO: after LowerArrayAggsToRunAggs: IR size 177: (MakeTuple (0) (Let __iruid_1897 (ToDict (StreamMap __iruid_1898 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1898)) (SelectFields (scores) (Ref __iruid_1898))))) (Let __iruid_1899 (ToArray (StreamMap __iruid_1900 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1900) None (__uid_4 (Let __iruid_1901 (MakeStruct (s (GetField s (Ref __iruid_1900)))) (If (IsNA (Ref __iruid_1897)) (NA Struct{scores:Array[Float64]}) (Let __iruid_1902 (LowerBoundOnOrderedCollection True (Ref __iruid_1897) (Ref __iruid_1901)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_1902) (ArrayLen (CastToArray (Ref __iruid_1897)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_1897)) (Ref __iruid_1902))) (Ref __iruid_1901)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_1897)) (Ref __iruid_1902))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_1903 (MakeStruct (rows (ToArray (StreamMap __iruid_1904 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1899)) (I32 1)) (Let __iruid_1905 (ArrayRef -1 (Ref __iruid_1899) (Ref __iruid_1904)) (InsertFields (SelectFields () (Ref __iruid_1905)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1905)))))))))) (Let __iruid_1906 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1907 (CollectDistributedArray table_aggregate_singlestage __iruid_1908 __iruid_1909 (Let __iruid_1910 (ArrayLen (GetField rows (Ref __iruid_1903))) (Let __iruid_1911 (ToArray (StreamMap __iruid_1912 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1912) (Ref __iruid_1910)) (I32 16)))) (Let __iruid_1913 (MakeArray Array[Int32] (Ref __iruid_1910)) (Let __iruid_1914 (If (IsNA (Ref __iruid_1911)) (NA Array[Int32]) (If (IsNA (Ref __iruid_1913)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_1915 (MakeStream Stream[Array[Int32]] False (Ref __iruid_1911) (Ref __iruid_1913)) (ToStream False (Ref __iruid_1915)))))) (Let __iruid_1916 (GetField rows (Ref __iruid_1903)) (StreamMap __iruid_1917 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1918 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1914) (Ref __iruid_1917)) (ArrayRef -1 (Ref __iruid_1914) (ApplyBinaryPrimOp Add (Ref __iruid_1917) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1916) (Ref __iruid_1918)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1906))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1909))))) (StreamFor __iruid_1919 (ToStream True (Ref __iruid_1908)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1919))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1906)))) 
(StreamFor __iruid_1920 (ToStream True (Ref __iruid_1907)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1920)))))) (Let __iruid_1921 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1921)))))))))) 2023-04-22 21:13:41.502 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 177: (MakeTuple (0) (Let __iruid_1972 (ToDict (StreamMap __iruid_1973 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1973)) (SelectFields (scores) (Ref __iruid_1973))))) (Let __iruid_1974 (ToArray (StreamMap __iruid_1975 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1975) None (__uid_4 (Let __iruid_1976 (MakeStruct (s (GetField s (Ref __iruid_1975)))) (If (IsNA (Ref __iruid_1972)) (NA Struct{scores:Array[Float64]}) (Let __iruid_1977 (LowerBoundOnOrderedCollection True (Ref __iruid_1972) (Ref __iruid_1976)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_1977) (ArrayLen (CastToArray (Ref __iruid_1972)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_1972)) (Ref __iruid_1977))) (Ref __iruid_1976)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_1972)) (Ref __iruid_1977))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_1978 (MakeStruct (rows (ToArray (StreamMap __iruid_1979 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1974)) (I32 1)) (Let __iruid_1980 (ArrayRef -1 (Ref __iruid_1974) (Ref __iruid_1979)) (InsertFields (SelectFields () (Ref __iruid_1980)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1980)))))))))) (Let __iruid_1981 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1982 (CollectDistributedArray table_aggregate_singlestage __iruid_1983 __iruid_1984 (Let __iruid_1985 (ArrayLen (GetField rows (Ref __iruid_1978))) (Let __iruid_1986 (ToArray (StreamMap __iruid_1987 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1987) (Ref __iruid_1985)) (I32 16)))) (Let __iruid_1988 (MakeArray Array[Int32] (Ref __iruid_1985)) (Let __iruid_1989 (If (IsNA (Ref __iruid_1986)) (NA Array[Int32]) (If (IsNA (Ref __iruid_1988)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_1990 (MakeStream Stream[Array[Int32]] False (Ref __iruid_1986) (Ref __iruid_1988)) (ToStream False (Ref __iruid_1990)))))) (Let __iruid_1991 (GetField rows (Ref __iruid_1978)) (StreamMap __iruid_1992 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1993 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1989) (Ref __iruid_1992)) (ArrayRef -1 (Ref __iruid_1989) (ApplyBinaryPrimOp Add (Ref __iruid_1992) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1991) (Ref __iruid_1993)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1981))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1984))))) (StreamFor __iruid_1994 (ToStream True (Ref __iruid_1983)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1994))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg 
((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1981)))) (StreamFor __iruid_1995 (ToStream True (Ref __iruid_1982)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1995)))))) (Let __iruid_1996 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1996)))))))))) 2023-04-22 21:13:41.741 : INFO: instruction count: 3: __C1662HailClassLoaderContainer. 2023-04-22 21:13:41.741 : INFO: instruction count: 3: __C1662HailClassLoaderContainer. 2023-04-22 21:13:41.741 : INFO: instruction count: 3: __C1664FSContainer. 2023-04-22 21:13:41.741 : INFO: instruction count: 3: __C1664FSContainer. 2023-04-22 21:13:41.745 : INFO: instruction count: 3: __C1666collect_distributed_array_table_aggregate_singlestage. 2023-04-22 21:13:41.746 : INFO: instruction count: 282: __C1666collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:41.746 : INFO: instruction count: 17: __C1666collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:41.746 : INFO: instruction count: 58: __C1666collect_distributed_array_table_aggregate_singlestage.__m1668DECODE_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:41.746 : INFO: instruction count: 58: __C1666collect_distributed_array_table_aggregate_singlestage.__m1669INPLACE_DECODE_o_array_of_r_struct_of_o_array_of_r_float64END_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:41.746 : INFO: instruction count: 48: __C1666collect_distributed_array_table_aggregate_singlestage.__m1670INPLACE_DECODE_r_struct_of_o_array_of_r_float64END_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:41.746 : INFO: instruction count: 58: __C1666collect_distributed_array_table_aggregate_singlestage.__m1671INPLACE_DECODE_o_array_of_r_float64_TO_o_array_of_r_float64 2023-04-22 21:13:41.746 : INFO: instruction count: 10: __C1666collect_distributed_array_table_aggregate_singlestage.__m1672INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:41.746 : INFO: instruction count: 27: __C1666collect_distributed_array_table_aggregate_singlestage.__m1677DECODE_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:41.746 : INFO: instruction count: 17: __C1666collect_distributed_array_table_aggregate_singlestage.__m1678INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_tuple_of_r_binaryENDEND 2023-04-22 21:13:41.746 : INFO: instruction count: 17: __C1666collect_distributed_array_table_aggregate_singlestage.__m1679INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:41.746 : INFO: instruction count: 31: __C1666collect_distributed_array_table_aggregate_singlestage.__m1680INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:41.746 : INFO: instruction count: 9: __C1666collect_distributed_array_table_aggregate_singlestage.__m1686begin_group_0 2023-04-22 21:13:41.746 : INFO: instruction count: 76: __C1666collect_distributed_array_table_aggregate_singlestage.__m1687begin_group_0 2023-04-22 21:13:41.746 : INFO: instruction count: 11: __C1666collect_distributed_array_table_aggregate_singlestage.__m1690setup_null 2023-04-22 21:13:41.746 : INFO: instruction count: 73: __C1666collect_distributed_array_table_aggregate_singlestage.__m1691split_StreamFor 2023-04-22 21:13:41.747 : INFO: instruction count: 55: __C1666collect_distributed_array_table_aggregate_singlestage.__m1699begin_group_0 
2023-04-22 21:13:41.747 : INFO: instruction count: 5: __C1666collect_distributed_array_table_aggregate_singlestage.__m1700toInt64 2023-04-22 21:13:41.747 : INFO: instruction count: 11: __C1666collect_distributed_array_table_aggregate_singlestage.__m1704setup_null 2023-04-22 21:13:41.747 : INFO: instruction count: 9: __C1666collect_distributed_array_table_aggregate_singlestage.__m1707ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:41.747 : INFO: instruction count: 13: __C1666collect_distributed_array_table_aggregate_singlestage.__m1708ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:41.747 : INFO: instruction count: 16: __C1666collect_distributed_array_table_aggregate_singlestage.__m1709ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:41.747 : INFO: instruction count: 9: __C1666collect_distributed_array_table_aggregate_singlestage.setPartitionIndex 2023-04-22 21:13:41.747 : INFO: instruction count: 4: __C1666collect_distributed_array_table_aggregate_singlestage.addPartitionRegion 2023-04-22 21:13:41.747 : INFO: instruction count: 4: __C1666collect_distributed_array_table_aggregate_singlestage.setPool 2023-04-22 21:13:41.762 : INFO: instruction count: 3: __C1666collect_distributed_array_table_aggregate_singlestage.addHailClassLoader 2023-04-22 21:13:41.762 : INFO: instruction count: 3: __C1666collect_distributed_array_table_aggregate_singlestage.addFS 2023-04-22 21:13:41.762 : INFO: instruction count: 4: __C1666collect_distributed_array_table_aggregate_singlestage.addTaskContext 2023-04-22 21:13:41.776 : INFO: encoder cache hit 2023-04-22 21:13:41.777 MemoryStore: INFO: Block broadcast_168 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:13:41.795 MemoryStore: INFO: Block broadcast_168_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:13:41.795 BlockManagerInfo: INFO: Added broadcast_168_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:41.810 SparkContext: INFO: Created broadcast 168 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:41.810 : INFO: instruction count: 3: __C1497HailClassLoaderContainer. 2023-04-22 21:13:41.810 : INFO: instruction count: 3: __C1497HailClassLoaderContainer. 2023-04-22 21:13:41.810 : INFO: instruction count: 3: __C1499FSContainer. 2023-04-22 21:13:41.810 : INFO: instruction count: 3: __C1499FSContainer. 2023-04-22 21:13:41.873 : INFO: instruction count: 3: __C1501Compiled. 
2023-04-22 21:13:41.873 : INFO: instruction count: 59: __C1501Compiled.apply 2023-04-22 21:13:41.873 : INFO: instruction count: 377: __C1501Compiled.__m1503split_ToDict 2023-04-22 21:13:41.873 : INFO: instruction count: 12: __C1501Compiled.__m1513setup_jab 2023-04-22 21:13:41.874 : INFO: instruction count: 202: __C1501Compiled.__m1516arraySorter_outer 2023-04-22 21:13:41.874 : INFO: instruction count: 93: __C1501Compiled.__m1517arraySorter_merge 2023-04-22 21:13:41.874 : INFO: instruction count: 11: __C1501Compiled.__m1520ord_lt 2023-04-22 21:13:41.874 : INFO: instruction count: 52: __C1501Compiled.__m1521ord_ltNonnull 2023-04-22 21:13:41.874 : INFO: instruction count: 11: __C1501Compiled.__m1522ord_lt 2023-04-22 21:13:41.874 : INFO: instruction count: 21: __C1501Compiled.__m1523ord_ltNonnull 2023-04-22 21:13:41.874 : INFO: instruction count: 9: __C1501Compiled.__m1524ord_compareNonnull 2023-04-22 21:13:41.874 : INFO: instruction count: 89: __C1501Compiled.__m1525ord_compareNonnull 2023-04-22 21:13:41.874 : INFO: instruction count: 11: __C1501Compiled.__m1526ord_equiv 2023-04-22 21:13:41.874 : INFO: instruction count: 21: __C1501Compiled.__m1527ord_equivNonnull 2023-04-22 21:13:41.874 : INFO: instruction count: 36: __C1501Compiled.__m1528arraySorter_splitMerge 2023-04-22 21:13:41.875 : INFO: instruction count: 264: __C1501Compiled.__m1529distinctFromSorted 2023-04-22 21:13:41.875 : INFO: instruction count: 30: __C1501Compiled.__m1532ord_equiv 2023-04-22 21:13:41.875 : INFO: instruction count: 39: __C1501Compiled.__m1533ord_equivNonnull 2023-04-22 21:13:41.875 : INFO: instruction count: 211: __C1501Compiled.__m1541split_ToArray 2023-04-22 21:13:41.875 : INFO: instruction count: 146: __C1501Compiled.__m1549split_Let 2023-04-22 21:13:41.875 : INFO: instruction count: 60: __C1501Compiled.__m1552findElt 2023-04-22 21:13:41.875 : INFO: instruction count: 11: __C1501Compiled.__m1553ord_lt 2023-04-22 21:13:41.875 : INFO: instruction count: 44: __C1501Compiled.__m1554ord_ltNonnull 2023-04-22 21:13:41.875 : INFO: instruction count: 11: __C1501Compiled.__m1556ord_equiv 2023-04-22 21:13:41.875 : INFO: instruction count: 14: __C1501Compiled.__m1557ord_equivNonnull 2023-04-22 21:13:41.875 : INFO: instruction count: 35: __C1501Compiled.__m1558arrayref_bounds_check 2023-04-22 21:13:41.876 : INFO: instruction count: 11: __C1501Compiled.__m1559ord_equiv 2023-04-22 21:13:41.876 : INFO: instruction count: 31: __C1501Compiled.__m1560ord_equivNonnull 2023-04-22 21:13:41.876 : INFO: instruction count: 245: __C1501Compiled.__m1569split_Let 2023-04-22 21:13:41.876 : INFO: instruction count: 255: __C1501Compiled.__m1571split_ToArray 2023-04-22 21:13:41.877 : INFO: instruction count: 9: __C1501Compiled.__m1589begin_group_0 2023-04-22 21:13:41.877 : INFO: instruction count: 11: __C1501Compiled.__m1592setup_null 2023-04-22 21:13:41.877 : INFO: instruction count: 503: __C1501Compiled.__m1596split_CollectDistributedArray 2023-04-22 21:13:41.877 : INFO: instruction count: 177: __C1501Compiled.__m1599split_ToArray 2023-04-22 21:13:41.878 : INFO: instruction count: 295: __C1501Compiled.__m1615split_ToArray 2023-04-22 21:13:41.878 : INFO: instruction count: 12: __C1501Compiled.__m1635setup_iab 2023-04-22 21:13:41.878 : INFO: instruction count: 255: __C1501Compiled.__m1651split_ToArray 2023-04-22 21:13:41.878 : INFO: instruction count: 4: __C1501Compiled.setBackend 2023-04-22 21:13:41.878 : INFO: instruction count: 48: 
__C1501Compiled.__m1718ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND 2023-04-22 21:13:41.878 : INFO: instruction count: 35: __C1501Compiled.__m1719ENCODE_SIndexablePointer_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:41.878 : INFO: instruction count: 48: __C1501Compiled.__m1720ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:41.878 : INFO: instruction count: 39: __C1501Compiled.__m1721ENCODE_SIndexablePointer_TO_o_array_of_r_float64 2023-04-22 21:13:41.878 : INFO: instruction count: 4: __C1501Compiled.__m1722ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:41.878 : INFO: instruction count: 9: __C1501Compiled.__m1723ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND 2023-04-22 21:13:41.878 : INFO: instruction count: 9: __C1501Compiled.__m1724ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:41.878 : INFO: instruction count: 13: __C1501Compiled.__m1725ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:41.878 : INFO: instruction count: 16: __C1501Compiled.__m1726ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:41.878 : INFO: instruction count: 27: __C1501Compiled.__m1729DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:41.878 : INFO: instruction count: 17: __C1501Compiled.__m1730INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:41.878 : INFO: instruction count: 31: __C1501Compiled.__m1731INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:41.879 : INFO: instruction count: 9: __C1501Compiled.__m1741begin_group_0 2023-04-22 21:13:41.879 : INFO: instruction count: 64: __C1501Compiled.__m1742begin_group_0 2023-04-22 21:13:41.879 : INFO: instruction count: 11: __C1501Compiled.__m1745setup_null 2023-04-22 21:13:41.879 : INFO: instruction count: 68: __C1501Compiled.__m1746split_StreamFor 2023-04-22 21:13:41.879 : INFO: instruction count: 84: __C1501Compiled.__m1754begin_group_0 2023-04-22 21:13:41.879 : INFO: instruction count: 9: __C1501Compiled.setPartitionIndex 2023-04-22 21:13:41.879 : INFO: instruction count: 4: __C1501Compiled.addPartitionRegion 2023-04-22 21:13:41.879 : INFO: instruction count: 4: __C1501Compiled.setPool 2023-04-22 21:13:41.879 : INFO: instruction count: 3: __C1501Compiled.addHailClassLoader 2023-04-22 21:13:41.879 : INFO: instruction count: 3: __C1501Compiled.addFS 2023-04-22 21:13:41.879 : INFO: instruction count: 4: __C1501Compiled.addTaskContext 2023-04-22 21:13:41.879 : INFO: instruction count: 3: __C1501Compiled.setObjects 2023-04-22 21:13:41.879 : INFO: instruction count: 96: __C1501Compiled.addAndDecodeLiterals 2023-04-22 21:13:41.879 : INFO: instruction count: 18: __C1501Compiled.__m1763DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:41.879 : INFO: instruction count: 27: __C1501Compiled.__m1764DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:41.879 : INFO: instruction count: 58: __C1501Compiled.__m1765INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:41.879 : INFO: instruction count: 17: __C1501Compiled.__m1766INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:41.879 : INFO: instruction count: 31: __C1501Compiled.__m1767INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:41.879 : INFO: instruction count: 27: 
__C1501Compiled.__m1768DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:41.879 : INFO: instruction count: 58: __C1501Compiled.__m1769INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:41.879 : INFO: instruction count: 26: __C1501Compiled.__m1770INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:41.880 : INFO: instruction count: 58: __C1501Compiled.__m1771INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:41.880 : INFO: instruction count: 10: __C1501Compiled.__m1772INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:41.880 : INFO: instruction count: 3: __C1727staticWrapperClass_1. 2023-04-22 21:13:41.881 : INFO: initial IR: IR size 133: (MakeTuple (0) (Let __iruid_1676 (ToDict (StreamMap __iruid_1677 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1677)) (SelectFields (scores) (Ref __iruid_1677))))) (Let __iruid_1678 (ToArray (StreamMap __iruid_1679 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1679) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1676) (MakeStruct (s (GetField s (Ref __iruid_1679))))))))) (Let __iruid_1680 (MakeStruct (rows (ToArray (StreamMap __iruid_1681 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1678)) (I32 1)) (Let __iruid_1682 (ArrayRef -1 (Ref __iruid_1678) (Ref __iruid_1681)) (InsertFields (SelectFields () (Ref __iruid_1682)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1682)))))))))) (Let __iruid_1683 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1684 (CollectDistributedArray table_aggregate_singlestage __iruid_1685 __iruid_1686 (Let __iruid_1687 (ArrayLen (GetField rows (Ref __iruid_1680))) (Let __iruid_1688 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1689 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1689) (Ref __iruid_1687)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1687))) (Let __iruid_1690 (GetField rows (Ref __iruid_1680)) (StreamMap __iruid_1691 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1692 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1688) (Ref __iruid_1691)) (ArrayRef -1 (Ref __iruid_1688) (ApplyBinaryPrimOp Add (Ref __iruid_1691) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1690) (Ref __iruid_1692)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1683))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1686))))) (StreamFor __iruid_1693 (ToStream True (Ref __iruid_1685)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1693))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1683)))) (StreamFor __iruid_1694 (ToStream True (Ref __iruid_1684)) (Begin (CombOpValue 0 (Sum 
(TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1694)))))) (Let __iruid_1695 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1695)))))))))) 2023-04-22 21:13:41.955 : INFO: after optimize: compileLowerer, initial IR: IR size 133: (MakeTuple (0) (Let __iruid_2092 (ToDict (StreamMap __iruid_2093 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2093)) (SelectFields (scores) (Ref __iruid_2093))))) (Let __iruid_2094 (ToArray (StreamMap __iruid_2095 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2095) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_2092) (MakeStruct (s (GetField s (Ref __iruid_2095))))))))) (Let __iruid_2096 (MakeStruct (rows (ToArray (StreamMap __iruid_2097 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2094)) (I32 1)) (Let __iruid_2098 (ArrayRef -1 (Ref __iruid_2094) (Ref __iruid_2097)) (InsertFields (SelectFields () (Ref __iruid_2098)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2098)))))))))) (Let __iruid_2099 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2100 (CollectDistributedArray table_aggregate_singlestage __iruid_2101 __iruid_2102 (Let __iruid_2103 (ArrayLen (GetField rows (Ref __iruid_2096))) (Let __iruid_2104 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_2105 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2105) (Ref __iruid_2103)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_2103))) (Let __iruid_2106 (GetField rows (Ref __iruid_2096)) (StreamMap __iruid_2107 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2108 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2104) (Ref __iruid_2107)) (ArrayRef -1 (Ref __iruid_2104) (ApplyBinaryPrimOp Add (Ref __iruid_2107) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2106) (Ref __iruid_2108)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2099))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2102))))) (StreamFor __iruid_2109 (ToStream True (Ref __iruid_2101)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2109))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2099)))) (StreamFor __iruid_2110 (ToStream True (Ref __iruid_2100)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2110)))))) (Let __iruid_2111 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2111)))))))))) 2023-04-22 21:13:41.959 : INFO: after InlineApplyIR: IR size 179: (MakeTuple (0) (Let __iruid_2092 (ToDict (StreamMap __iruid_2093 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2093)) (SelectFields (scores) (Ref __iruid_2093))))) (Let __iruid_2094 (ToArray (StreamMap __iruid_2095 (ToStream False (GetField __cols (EncodedLiteral 
Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2095) None (__uid_4 (Let __iruid_2124 (Ref __iruid_2092) (Let __iruid_2125 (MakeStruct (s (GetField s (Ref __iruid_2095)))) (If (IsNA (Ref __iruid_2124)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2126 (LowerBoundOnOrderedCollection True (Ref __iruid_2124) (Ref __iruid_2125)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2126) (ArrayLen (CastToArray (Ref __iruid_2124)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2124)) (Ref __iruid_2126))) (Ref __iruid_2125)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2124)) (Ref __iruid_2126))) (NA Struct{scores:Array[Float64]}))))))))))) (Let __iruid_2096 (MakeStruct (rows (ToArray (StreamMap __iruid_2097 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2094)) (I32 1)) (Let __iruid_2098 (ArrayRef -1 (Ref __iruid_2094) (Ref __iruid_2097)) (InsertFields (SelectFields () (Ref __iruid_2098)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2098)))))))))) (Let __iruid_2099 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2100 (CollectDistributedArray table_aggregate_singlestage __iruid_2101 __iruid_2102 (Let __iruid_2103 (ArrayLen (GetField rows (Ref __iruid_2096))) (Let __iruid_2104 (Let __iruid_2127 (ToArray (StreamMap __iruid_2105 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2105) (Ref __iruid_2103)) (I32 16)))) (Let __iruid_2128 (MakeArray Array[Int32] (Ref __iruid_2103)) (If (IsNA (Ref __iruid_2127)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2128)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2129 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2127) (Ref __iruid_2128)) (ToStream False (Ref __iruid_2129)))))))) (Let __iruid_2106 (GetField rows (Ref __iruid_2096)) (StreamMap __iruid_2107 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2108 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2104) (Ref __iruid_2107)) (ArrayRef -1 (Ref __iruid_2104) (ApplyBinaryPrimOp Add (Ref __iruid_2107) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2106) (Ref __iruid_2108)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2099))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2102))))) (StreamFor __iruid_2109 (ToStream True (Ref __iruid_2101)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2109))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2099)))) (StreamFor __iruid_2110 (ToStream True (Ref __iruid_2100)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2110)))))) (Let __iruid_2111 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2111)))))))))) 2023-04-22 21:13:42.037 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 177: (MakeTuple (0) (Let __iruid_2181 (ToDict (StreamMap __iruid_2182 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2182)) 
(SelectFields (scores) (Ref __iruid_2182))))) (Let __iruid_2183 (ToArray (StreamMap __iruid_2184 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2184) None (__uid_4 (Let __iruid_2185 (MakeStruct (s (GetField s (Ref __iruid_2184)))) (If (IsNA (Ref __iruid_2181)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2186 (LowerBoundOnOrderedCollection True (Ref __iruid_2181) (Ref __iruid_2185)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2186) (ArrayLen (CastToArray (Ref __iruid_2181)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2181)) (Ref __iruid_2186))) (Ref __iruid_2185)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2181)) (Ref __iruid_2186))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2187 (MakeStruct (rows (ToArray (StreamMap __iruid_2188 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2183)) (I32 1)) (Let __iruid_2189 (ArrayRef -1 (Ref __iruid_2183) (Ref __iruid_2188)) (InsertFields (SelectFields () (Ref __iruid_2189)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2189)))))))))) (Let __iruid_2190 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2191 (CollectDistributedArray table_aggregate_singlestage __iruid_2192 __iruid_2193 (Let __iruid_2194 (ArrayLen (GetField rows (Ref __iruid_2187))) (Let __iruid_2195 (ToArray (StreamMap __iruid_2196 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2196) (Ref __iruid_2194)) (I32 16)))) (Let __iruid_2197 (MakeArray Array[Int32] (Ref __iruid_2194)) (Let __iruid_2198 (If (IsNA (Ref __iruid_2195)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2197)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2199 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2195) (Ref __iruid_2197)) (ToStream False (Ref __iruid_2199)))))) (Let __iruid_2200 (GetField rows (Ref __iruid_2187)) (StreamMap __iruid_2201 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2202 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2198) (Ref __iruid_2201)) (ArrayRef -1 (Ref __iruid_2198) (ApplyBinaryPrimOp Add (Ref __iruid_2201) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2200) (Ref __iruid_2202)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2190))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2193))))) (StreamFor __iruid_2203 (ToStream True (Ref __iruid_2192)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2203))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2190)))) (StreamFor __iruid_2204 (ToStream True (Ref __iruid_2191)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2204)))))) (Let __iruid_2205 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2205)))))))))) 2023-04-22 21:13:42.060 : INFO: after LowerArrayAggsToRunAggs: IR size 177: (MakeTuple (0) (Let __iruid_2181 (ToDict (StreamMap __iruid_2182 (ToStream False (GetField scores (EncodedLiteral 
Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2182)) (SelectFields (scores) (Ref __iruid_2182))))) (Let __iruid_2183 (ToArray (StreamMap __iruid_2184 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2184) None (__uid_4 (Let __iruid_2185 (MakeStruct (s (GetField s (Ref __iruid_2184)))) (If (IsNA (Ref __iruid_2181)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2186 (LowerBoundOnOrderedCollection True (Ref __iruid_2181) (Ref __iruid_2185)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2186) (ArrayLen (CastToArray (Ref __iruid_2181)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2181)) (Ref __iruid_2186))) (Ref __iruid_2185)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2181)) (Ref __iruid_2186))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2187 (MakeStruct (rows (ToArray (StreamMap __iruid_2188 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2183)) (I32 1)) (Let __iruid_2189 (ArrayRef -1 (Ref __iruid_2183) (Ref __iruid_2188)) (InsertFields (SelectFields () (Ref __iruid_2189)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2189)))))))))) (Let __iruid_2190 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2191 (CollectDistributedArray table_aggregate_singlestage __iruid_2192 __iruid_2193 (Let __iruid_2194 (ArrayLen (GetField rows (Ref __iruid_2187))) (Let __iruid_2195 (ToArray (StreamMap __iruid_2196 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2196) (Ref __iruid_2194)) (I32 16)))) (Let __iruid_2197 (MakeArray Array[Int32] (Ref __iruid_2194)) (Let __iruid_2198 (If (IsNA (Ref __iruid_2195)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2197)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2199 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2195) (Ref __iruid_2197)) (ToStream False (Ref __iruid_2199)))))) (Let __iruid_2200 (GetField rows (Ref __iruid_2187)) (StreamMap __iruid_2201 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2202 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2198) (Ref __iruid_2201)) (ArrayRef -1 (Ref __iruid_2198) (ApplyBinaryPrimOp Add (Ref __iruid_2201) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2200) (Ref __iruid_2202)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2190))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2193))))) (StreamFor __iruid_2203 (ToStream True (Ref __iruid_2192)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2203))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2190)))) (StreamFor __iruid_2204 (ToStream True (Ref __iruid_2191)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2204)))))) (Let __iruid_2205 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2205)))))))))) 2023-04-22 21:13:42.112 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 
177: (MakeTuple (0) (Let __iruid_2256 (ToDict (StreamMap __iruid_2257 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2257)) (SelectFields (scores) (Ref __iruid_2257))))) (Let __iruid_2258 (ToArray (StreamMap __iruid_2259 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2259) None (__uid_4 (Let __iruid_2260 (MakeStruct (s (GetField s (Ref __iruid_2259)))) (If (IsNA (Ref __iruid_2256)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2261 (LowerBoundOnOrderedCollection True (Ref __iruid_2256) (Ref __iruid_2260)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2261) (ArrayLen (CastToArray (Ref __iruid_2256)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2256)) (Ref __iruid_2261))) (Ref __iruid_2260)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2256)) (Ref __iruid_2261))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2262 (MakeStruct (rows (ToArray (StreamMap __iruid_2263 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2258)) (I32 1)) (Let __iruid_2264 (ArrayRef -1 (Ref __iruid_2258) (Ref __iruid_2263)) (InsertFields (SelectFields () (Ref __iruid_2264)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2264)))))))))) (Let __iruid_2265 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2266 (CollectDistributedArray table_aggregate_singlestage __iruid_2267 __iruid_2268 (Let __iruid_2269 (ArrayLen (GetField rows (Ref __iruid_2262))) (Let __iruid_2270 (ToArray (StreamMap __iruid_2271 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2271) (Ref __iruid_2269)) (I32 16)))) (Let __iruid_2272 (MakeArray Array[Int32] (Ref __iruid_2269)) (Let __iruid_2273 (If (IsNA (Ref __iruid_2270)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2272)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2274 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2270) (Ref __iruid_2272)) (ToStream False (Ref __iruid_2274)))))) (Let __iruid_2275 (GetField rows (Ref __iruid_2262)) (StreamMap __iruid_2276 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2277 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2273) (Ref __iruid_2276)) (ArrayRef -1 (Ref __iruid_2273) (ApplyBinaryPrimOp Add (Ref __iruid_2276) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2275) (Ref __iruid_2277)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2265))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2268))))) (StreamFor __iruid_2278 (ToStream True (Ref __iruid_2267)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2278))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2265)))) (StreamFor __iruid_2279 (ToStream True (Ref __iruid_2266)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2279)))))) (Let __iruid_2280 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref 
__iruid_2280)))))))))) 2023-04-22 21:13:42.230 : INFO: instruction count: 3: __C1938HailClassLoaderContainer. 2023-04-22 21:13:42.230 : INFO: instruction count: 3: __C1938HailClassLoaderContainer. 2023-04-22 21:13:42.230 : INFO: instruction count: 3: __C1940FSContainer. 2023-04-22 21:13:42.230 : INFO: instruction count: 3: __C1940FSContainer. 2023-04-22 21:13:42.244 : INFO: instruction count: 3: __C1942collect_distributed_array_table_aggregate_singlestage. 2023-04-22 21:13:42.244 : INFO: instruction count: 282: __C1942collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:42.245 : INFO: instruction count: 17: __C1942collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:42.245 : INFO: instruction count: 58: __C1942collect_distributed_array_table_aggregate_singlestage.__m1944DECODE_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.245 : INFO: instruction count: 58: __C1942collect_distributed_array_table_aggregate_singlestage.__m1945INPLACE_DECODE_o_array_of_r_struct_of_o_array_of_r_float64END_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.245 : INFO: instruction count: 48: __C1942collect_distributed_array_table_aggregate_singlestage.__m1946INPLACE_DECODE_r_struct_of_o_array_of_r_float64END_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.245 : INFO: instruction count: 58: __C1942collect_distributed_array_table_aggregate_singlestage.__m1947INPLACE_DECODE_o_array_of_r_float64_TO_o_array_of_r_float64 2023-04-22 21:13:42.245 : INFO: instruction count: 10: __C1942collect_distributed_array_table_aggregate_singlestage.__m1948INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:42.245 : INFO: instruction count: 27: __C1942collect_distributed_array_table_aggregate_singlestage.__m1953DECODE_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.245 : INFO: instruction count: 17: __C1942collect_distributed_array_table_aggregate_singlestage.__m1954INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_tuple_of_r_binaryENDEND 2023-04-22 21:13:42.245 : INFO: instruction count: 17: __C1942collect_distributed_array_table_aggregate_singlestage.__m1955INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:42.245 : INFO: instruction count: 31: __C1942collect_distributed_array_table_aggregate_singlestage.__m1956INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:42.245 : INFO: instruction count: 9: __C1942collect_distributed_array_table_aggregate_singlestage.__m1962begin_group_0 2023-04-22 21:13:42.245 : INFO: instruction count: 76: __C1942collect_distributed_array_table_aggregate_singlestage.__m1963begin_group_0 2023-04-22 21:13:42.245 : INFO: instruction count: 11: __C1942collect_distributed_array_table_aggregate_singlestage.__m1966setup_null 2023-04-22 21:13:42.245 : INFO: instruction count: 73: __C1942collect_distributed_array_table_aggregate_singlestage.__m1967split_StreamFor 2023-04-22 21:13:42.245 : INFO: instruction count: 55: __C1942collect_distributed_array_table_aggregate_singlestage.__m1975begin_group_0 2023-04-22 21:13:42.245 : INFO: instruction count: 5: __C1942collect_distributed_array_table_aggregate_singlestage.__m1976toInt64 2023-04-22 21:13:42.245 : INFO: instruction count: 11: __C1942collect_distributed_array_table_aggregate_singlestage.__m1980setup_null 2023-04-22 21:13:42.245 : INFO: instruction count: 9: 
__C1942collect_distributed_array_table_aggregate_singlestage.__m1983ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:42.245 : INFO: instruction count: 13: __C1942collect_distributed_array_table_aggregate_singlestage.__m1984ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:42.246 : INFO: instruction count: 16: __C1942collect_distributed_array_table_aggregate_singlestage.__m1985ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:42.246 : INFO: instruction count: 9: __C1942collect_distributed_array_table_aggregate_singlestage.setPartitionIndex 2023-04-22 21:13:42.246 : INFO: instruction count: 4: __C1942collect_distributed_array_table_aggregate_singlestage.addPartitionRegion 2023-04-22 21:13:42.246 : INFO: instruction count: 4: __C1942collect_distributed_array_table_aggregate_singlestage.setPool 2023-04-22 21:13:42.246 : INFO: instruction count: 3: __C1942collect_distributed_array_table_aggregate_singlestage.addHailClassLoader 2023-04-22 21:13:42.246 : INFO: instruction count: 3: __C1942collect_distributed_array_table_aggregate_singlestage.addFS 2023-04-22 21:13:42.246 : INFO: instruction count: 4: __C1942collect_distributed_array_table_aggregate_singlestage.addTaskContext 2023-04-22 21:13:42.266 : INFO: encoder cache hit 2023-04-22 21:13:42.266 MemoryStore: INFO: Block broadcast_169 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:13:42.272 MemoryStore: INFO: Block broadcast_169_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:13:42.272 BlockManagerInfo: INFO: Added broadcast_169_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:42.273 SparkContext: INFO: Created broadcast 169 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:42.273 : INFO: instruction count: 3: __C1773HailClassLoaderContainer. 2023-04-22 21:13:42.273 : INFO: instruction count: 3: __C1773HailClassLoaderContainer. 2023-04-22 21:13:42.273 : INFO: instruction count: 3: __C1775FSContainer. 2023-04-22 21:13:42.273 : INFO: instruction count: 3: __C1775FSContainer. 2023-04-22 21:13:42.319 : INFO: instruction count: 3: __C1777Compiled. 
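
The IR compiled above -- a ToDict over a per-sample scores literal, a keyed lookup per column via LowerBoundOnOrderedCollection, and a single-stage Sum of toInt64(IsNA(__scores)) -- is the lowered shape of an ordinary join-and-aggregate over sample columns. The sketch below is a minimal, hypothetical Python reconstruction of that kind of Hail pipeline; the table names, sample ids, and score values are illustrative assumptions, not values taken from this log.

import hail as hl

hl.init()  # this run used spark.master=local[*]

# Hypothetical stand-ins for the real inputs: a column table keyed by sample
# id `s`, and a per-sample score table carrying an array<float64>.
cols = hl.Table.parallelize(
    [{'s': 'sampleA'}, {'s': 'sampleB'}, {'s': 'sampleC'}],
    hl.tstruct(s=hl.tstr), key='s')
scores = hl.Table.parallelize(
    [{'s': 'sampleA', 'scores': [0.1, 0.2]},
     {'s': 'sampleC', 'scores': [0.3, 0.4]}],
    hl.tstruct(s=hl.tstr, scores=hl.tarray(hl.tfloat64)), key='s')

# Keyed lookup of scores per column (broadly the shape of the dict build plus
# LowerBoundOnOrderedCollection search in the IR above), followed by a
# single-stage aggregation counting missing score entries -- consistent with
# the Sum(toInt64(IsNA(__scores))) pattern in the RunAgg nodes.
joined = cols.annotate(scores=scores[cols.s].scores)
n_missing = joined.aggregate(hl.agg.count_where(hl.is_missing(joined.scores)))
print(n_missing)  # 1 for this toy input: sampleB has no scores row

The __cols field in the EncodedLiteral suggests the real query annotated and aggregated the columns of a MatrixTable (annotate_cols followed by aggregate_cols) rather than a bare Table, but the lowered aggregation has the same form either way.
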
2023-04-22 21:13:42.319 : INFO: instruction count: 59: __C1777Compiled.apply 2023-04-22 21:13:42.341 : INFO: instruction count: 377: __C1777Compiled.__m1779split_ToDict 2023-04-22 21:13:42.341 : INFO: instruction count: 12: __C1777Compiled.__m1789setup_jab 2023-04-22 21:13:42.342 : INFO: instruction count: 202: __C1777Compiled.__m1792arraySorter_outer 2023-04-22 21:13:42.342 : INFO: instruction count: 93: __C1777Compiled.__m1793arraySorter_merge 2023-04-22 21:13:42.342 : INFO: instruction count: 11: __C1777Compiled.__m1796ord_lt 2023-04-22 21:13:42.342 : INFO: instruction count: 52: __C1777Compiled.__m1797ord_ltNonnull 2023-04-22 21:13:42.342 : INFO: instruction count: 11: __C1777Compiled.__m1798ord_lt 2023-04-22 21:13:42.342 : INFO: instruction count: 21: __C1777Compiled.__m1799ord_ltNonnull 2023-04-22 21:13:42.342 : INFO: instruction count: 9: __C1777Compiled.__m1800ord_compareNonnull 2023-04-22 21:13:42.342 : INFO: instruction count: 89: __C1777Compiled.__m1801ord_compareNonnull 2023-04-22 21:13:42.342 : INFO: instruction count: 11: __C1777Compiled.__m1802ord_equiv 2023-04-22 21:13:42.342 : INFO: instruction count: 21: __C1777Compiled.__m1803ord_equivNonnull 2023-04-22 21:13:42.342 : INFO: instruction count: 36: __C1777Compiled.__m1804arraySorter_splitMerge 2023-04-22 21:13:42.343 : INFO: instruction count: 264: __C1777Compiled.__m1805distinctFromSorted 2023-04-22 21:13:42.343 : INFO: instruction count: 30: __C1777Compiled.__m1808ord_equiv 2023-04-22 21:13:42.343 : INFO: instruction count: 39: __C1777Compiled.__m1809ord_equivNonnull 2023-04-22 21:13:42.343 : INFO: instruction count: 211: __C1777Compiled.__m1817split_ToArray 2023-04-22 21:13:42.343 : INFO: instruction count: 146: __C1777Compiled.__m1825split_Let 2023-04-22 21:13:42.343 : INFO: instruction count: 60: __C1777Compiled.__m1828findElt 2023-04-22 21:13:42.343 : INFO: instruction count: 11: __C1777Compiled.__m1829ord_lt 2023-04-22 21:13:42.343 : INFO: instruction count: 44: __C1777Compiled.__m1830ord_ltNonnull 2023-04-22 21:13:42.343 : INFO: instruction count: 11: __C1777Compiled.__m1832ord_equiv 2023-04-22 21:13:42.343 : INFO: instruction count: 14: __C1777Compiled.__m1833ord_equivNonnull 2023-04-22 21:13:42.343 : INFO: instruction count: 35: __C1777Compiled.__m1834arrayref_bounds_check 2023-04-22 21:13:42.343 : INFO: instruction count: 11: __C1777Compiled.__m1835ord_equiv 2023-04-22 21:13:42.343 : INFO: instruction count: 31: __C1777Compiled.__m1836ord_equivNonnull 2023-04-22 21:13:42.344 : INFO: instruction count: 245: __C1777Compiled.__m1845split_Let 2023-04-22 21:13:42.344 : INFO: instruction count: 255: __C1777Compiled.__m1847split_ToArray 2023-04-22 21:13:42.344 : INFO: instruction count: 9: __C1777Compiled.__m1865begin_group_0 2023-04-22 21:13:42.344 : INFO: instruction count: 11: __C1777Compiled.__m1868setup_null 2023-04-22 21:13:42.345 : INFO: instruction count: 503: __C1777Compiled.__m1872split_CollectDistributedArray 2023-04-22 21:13:42.345 : INFO: instruction count: 177: __C1777Compiled.__m1875split_ToArray 2023-04-22 21:13:42.345 : INFO: instruction count: 295: __C1777Compiled.__m1891split_ToArray 2023-04-22 21:13:42.345 : INFO: instruction count: 12: __C1777Compiled.__m1911setup_iab 2023-04-22 21:13:42.345 : INFO: instruction count: 255: __C1777Compiled.__m1927split_ToArray 2023-04-22 21:13:42.345 : INFO: instruction count: 4: __C1777Compiled.setBackend 2023-04-22 21:13:42.345 : INFO: instruction count: 48: 
__C1777Compiled.__m1994ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND 2023-04-22 21:13:42.346 : INFO: instruction count: 35: __C1777Compiled.__m1995ENCODE_SIndexablePointer_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.346 : INFO: instruction count: 48: __C1777Compiled.__m1996ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.346 : INFO: instruction count: 39: __C1777Compiled.__m1997ENCODE_SIndexablePointer_TO_o_array_of_r_float64 2023-04-22 21:13:42.346 : INFO: instruction count: 4: __C1777Compiled.__m1998ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:42.346 : INFO: instruction count: 9: __C1777Compiled.__m1999ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND 2023-04-22 21:13:42.346 : INFO: instruction count: 9: __C1777Compiled.__m2000ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:42.346 : INFO: instruction count: 13: __C1777Compiled.__m2001ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:42.346 : INFO: instruction count: 16: __C1777Compiled.__m2002ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:42.346 : INFO: instruction count: 27: __C1777Compiled.__m2005DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.346 : INFO: instruction count: 17: __C1777Compiled.__m2006INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:42.346 : INFO: instruction count: 31: __C1777Compiled.__m2007INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:42.346 : INFO: instruction count: 9: __C1777Compiled.__m2017begin_group_0 2023-04-22 21:13:42.346 : INFO: instruction count: 64: __C1777Compiled.__m2018begin_group_0 2023-04-22 21:13:42.346 : INFO: instruction count: 11: __C1777Compiled.__m2021setup_null 2023-04-22 21:13:42.346 : INFO: instruction count: 68: __C1777Compiled.__m2022split_StreamFor 2023-04-22 21:13:42.346 : INFO: instruction count: 84: __C1777Compiled.__m2030begin_group_0 2023-04-22 21:13:42.346 : INFO: instruction count: 9: __C1777Compiled.setPartitionIndex 2023-04-22 21:13:42.346 : INFO: instruction count: 4: __C1777Compiled.addPartitionRegion 2023-04-22 21:13:42.346 : INFO: instruction count: 4: __C1777Compiled.setPool 2023-04-22 21:13:42.346 : INFO: instruction count: 3: __C1777Compiled.addHailClassLoader 2023-04-22 21:13:42.346 : INFO: instruction count: 3: __C1777Compiled.addFS 2023-04-22 21:13:42.346 : INFO: instruction count: 4: __C1777Compiled.addTaskContext 2023-04-22 21:13:42.346 : INFO: instruction count: 3: __C1777Compiled.setObjects 2023-04-22 21:13:42.347 : INFO: instruction count: 96: __C1777Compiled.addAndDecodeLiterals 2023-04-22 21:13:42.347 : INFO: instruction count: 18: __C1777Compiled.__m2039DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:42.347 : INFO: instruction count: 27: __C1777Compiled.__m2040DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.347 : INFO: instruction count: 58: __C1777Compiled.__m2041INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:42.347 : INFO: instruction count: 17: __C1777Compiled.__m2042INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:42.347 : INFO: instruction count: 31: __C1777Compiled.__m2043INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:42.350 : INFO: instruction count: 27: 
__C1777Compiled.__m2044DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.350 : INFO: instruction count: 58: __C1777Compiled.__m2045INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:42.350 : INFO: instruction count: 26: __C1777Compiled.__m2046INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:42.350 : INFO: instruction count: 58: __C1777Compiled.__m2047INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:42.350 : INFO: instruction count: 10: __C1777Compiled.__m2048INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:42.350 : INFO: instruction count: 3: __C2003staticWrapperClass_1. 2023-04-22 21:13:42.353 : INFO: initial IR: IR size 133: (MakeTuple (0) (Let __iruid_1676 (ToDict (StreamMap __iruid_1677 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_1677)) (SelectFields (scores) (Ref __iruid_1677))))) (Let __iruid_1678 (ToArray (StreamMap __iruid_1679 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_1679) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_1676) (MakeStruct (s (GetField s (Ref __iruid_1679))))))))) (Let __iruid_1680 (MakeStruct (rows (ToArray (StreamMap __iruid_1681 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_1678)) (I32 1)) (Let __iruid_1682 (ArrayRef -1 (Ref __iruid_1678) (Ref __iruid_1681)) (InsertFields (SelectFields () (Ref __iruid_1682)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_1682)))))))))) (Let __iruid_1683 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_1684 (CollectDistributedArray table_aggregate_singlestage __iruid_1685 __iruid_1686 (Let __iruid_1687 (ArrayLen (GetField rows (Ref __iruid_1680))) (Let __iruid_1688 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_1689 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_1689) (Ref __iruid_1687)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_1687))) (Let __iruid_1690 (GetField rows (Ref __iruid_1680)) (StreamMap __iruid_1691 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_1692 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_1688) (Ref __iruid_1691)) (ArrayRef -1 (Ref __iruid_1688) (ApplyBinaryPrimOp Add (Ref __iruid_1691) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_1690) (Ref __iruid_1692)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_1683))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_1686))))) (StreamFor __iruid_1693 (ToStream True (Ref __iruid_1685)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_1693))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_1683)))) (StreamFor __iruid_1694 (ToStream True (Ref __iruid_1684)) (Begin (CombOpValue 0 (Sum 
(TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_1694)))))) (Let __iruid_1695 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_1695)))))))))) 2023-04-22 21:13:42.409 : INFO: after optimize: compileLowerer, initial IR: IR size 133: (MakeTuple (0) (Let __iruid_2376 (ToDict (StreamMap __iruid_2377 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2377)) (SelectFields (scores) (Ref __iruid_2377))))) (Let __iruid_2378 (ToArray (StreamMap __iruid_2379 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2379) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_2376) (MakeStruct (s (GetField s (Ref __iruid_2379))))))))) (Let __iruid_2380 (MakeStruct (rows (ToArray (StreamMap __iruid_2381 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2378)) (I32 1)) (Let __iruid_2382 (ArrayRef -1 (Ref __iruid_2378) (Ref __iruid_2381)) (InsertFields (SelectFields () (Ref __iruid_2382)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2382)))))))))) (Let __iruid_2383 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2384 (CollectDistributedArray table_aggregate_singlestage __iruid_2385 __iruid_2386 (Let __iruid_2387 (ArrayLen (GetField rows (Ref __iruid_2380))) (Let __iruid_2388 (ApplyIR -1 extend () Array[Int32] (ToArray (StreamMap __iruid_2389 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2389) (Ref __iruid_2387)) (I32 16)))) (MakeArray Array[Int32] (Ref __iruid_2387))) (Let __iruid_2390 (GetField rows (Ref __iruid_2380)) (StreamMap __iruid_2391 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2392 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2388) (Ref __iruid_2391)) (ArrayRef -1 (Ref __iruid_2388) (ApplyBinaryPrimOp Add (Ref __iruid_2391) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2390) (Ref __iruid_2392)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2383))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2386))))) (StreamFor __iruid_2393 (ToStream True (Ref __iruid_2385)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2393))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2383)))) (StreamFor __iruid_2394 (ToStream True (Ref __iruid_2384)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2394)))))) (Let __iruid_2395 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2395)))))))))) 2023-04-22 21:13:42.426 : INFO: after InlineApplyIR: IR size 179: (MakeTuple (0) (Let __iruid_2376 (ToDict (StreamMap __iruid_2377 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2377)) (SelectFields (scores) (Ref __iruid_2377))))) (Let __iruid_2378 (ToArray (StreamMap __iruid_2379 (ToStream False (GetField __cols (EncodedLiteral 
Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2379) None (__uid_4 (Let __iruid_2408 (Ref __iruid_2376) (Let __iruid_2409 (MakeStruct (s (GetField s (Ref __iruid_2379)))) (If (IsNA (Ref __iruid_2408)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2410 (LowerBoundOnOrderedCollection True (Ref __iruid_2408) (Ref __iruid_2409)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2410) (ArrayLen (CastToArray (Ref __iruid_2408)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2408)) (Ref __iruid_2410))) (Ref __iruid_2409)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2408)) (Ref __iruid_2410))) (NA Struct{scores:Array[Float64]}))))))))))) (Let __iruid_2380 (MakeStruct (rows (ToArray (StreamMap __iruid_2381 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2378)) (I32 1)) (Let __iruid_2382 (ArrayRef -1 (Ref __iruid_2378) (Ref __iruid_2381)) (InsertFields (SelectFields () (Ref __iruid_2382)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2382)))))))))) (Let __iruid_2383 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2384 (CollectDistributedArray table_aggregate_singlestage __iruid_2385 __iruid_2386 (Let __iruid_2387 (ArrayLen (GetField rows (Ref __iruid_2380))) (Let __iruid_2388 (Let __iruid_2411 (ToArray (StreamMap __iruid_2389 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2389) (Ref __iruid_2387)) (I32 16)))) (Let __iruid_2412 (MakeArray Array[Int32] (Ref __iruid_2387)) (If (IsNA (Ref __iruid_2411)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2412)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2413 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2411) (Ref __iruid_2412)) (ToStream False (Ref __iruid_2413)))))))) (Let __iruid_2390 (GetField rows (Ref __iruid_2380)) (StreamMap __iruid_2391 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2392 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2388) (Ref __iruid_2391)) (ArrayRef -1 (Ref __iruid_2388) (ApplyBinaryPrimOp Add (Ref __iruid_2391) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2390) (Ref __iruid_2392)))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2383))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2386))))) (StreamFor __iruid_2393 (ToStream True (Ref __iruid_2385)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2393))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2383)))) (StreamFor __iruid_2394 (ToStream True (Ref __iruid_2384)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2394)))))) (Let __iruid_2395 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2395)))))))))) 2023-04-22 21:13:42.494 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 177: (MakeTuple (0) (Let __iruid_2465 (ToDict (StreamMap __iruid_2466 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2466)) 
(SelectFields (scores) (Ref __iruid_2466))))) (Let __iruid_2467 (ToArray (StreamMap __iruid_2468 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2468) None (__uid_4 (Let __iruid_2469 (MakeStruct (s (GetField s (Ref __iruid_2468)))) (If (IsNA (Ref __iruid_2465)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2470 (LowerBoundOnOrderedCollection True (Ref __iruid_2465) (Ref __iruid_2469)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2470) (ArrayLen (CastToArray (Ref __iruid_2465)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2465)) (Ref __iruid_2470))) (Ref __iruid_2469)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2465)) (Ref __iruid_2470))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2471 (MakeStruct (rows (ToArray (StreamMap __iruid_2472 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2467)) (I32 1)) (Let __iruid_2473 (ArrayRef -1 (Ref __iruid_2467) (Ref __iruid_2472)) (InsertFields (SelectFields () (Ref __iruid_2473)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2473)))))))))) (Let __iruid_2474 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2475 (CollectDistributedArray table_aggregate_singlestage __iruid_2476 __iruid_2477 (Let __iruid_2478 (ArrayLen (GetField rows (Ref __iruid_2471))) (Let __iruid_2479 (ToArray (StreamMap __iruid_2480 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2480) (Ref __iruid_2478)) (I32 16)))) (Let __iruid_2481 (MakeArray Array[Int32] (Ref __iruid_2478)) (Let __iruid_2482 (If (IsNA (Ref __iruid_2479)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2481)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2483 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2479) (Ref __iruid_2481)) (ToStream False (Ref __iruid_2483)))))) (Let __iruid_2484 (GetField rows (Ref __iruid_2471)) (StreamMap __iruid_2485 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2486 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2482) (Ref __iruid_2485)) (ArrayRef -1 (Ref __iruid_2482) (ApplyBinaryPrimOp Add (Ref __iruid_2485) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2484) (Ref __iruid_2486)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2474))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2477))))) (StreamFor __iruid_2487 (ToStream True (Ref __iruid_2476)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2487))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2474)))) (StreamFor __iruid_2488 (ToStream True (Ref __iruid_2475)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2488)))))) (Let __iruid_2489 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2489)))))))))) 2023-04-22 21:13:42.520 : INFO: after LowerArrayAggsToRunAggs: IR size 177: (MakeTuple (0) (Let __iruid_2465 (ToDict (StreamMap __iruid_2466 (ToStream False (GetField scores (EncodedLiteral 
Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2466)) (SelectFields (scores) (Ref __iruid_2466))))) (Let __iruid_2467 (ToArray (StreamMap __iruid_2468 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2468) None (__uid_4 (Let __iruid_2469 (MakeStruct (s (GetField s (Ref __iruid_2468)))) (If (IsNA (Ref __iruid_2465)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2470 (LowerBoundOnOrderedCollection True (Ref __iruid_2465) (Ref __iruid_2469)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2470) (ArrayLen (CastToArray (Ref __iruid_2465)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2465)) (Ref __iruid_2470))) (Ref __iruid_2469)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2465)) (Ref __iruid_2470))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2471 (MakeStruct (rows (ToArray (StreamMap __iruid_2472 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2467)) (I32 1)) (Let __iruid_2473 (ArrayRef -1 (Ref __iruid_2467) (Ref __iruid_2472)) (InsertFields (SelectFields () (Ref __iruid_2473)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2473)))))))))) (Let __iruid_2474 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2475 (CollectDistributedArray table_aggregate_singlestage __iruid_2476 __iruid_2477 (Let __iruid_2478 (ArrayLen (GetField rows (Ref __iruid_2471))) (Let __iruid_2479 (ToArray (StreamMap __iruid_2480 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2480) (Ref __iruid_2478)) (I32 16)))) (Let __iruid_2481 (MakeArray Array[Int32] (Ref __iruid_2478)) (Let __iruid_2482 (If (IsNA (Ref __iruid_2479)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2481)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2483 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2479) (Ref __iruid_2481)) (ToStream False (Ref __iruid_2483)))))) (Let __iruid_2484 (GetField rows (Ref __iruid_2471)) (StreamMap __iruid_2485 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2486 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2482) (Ref __iruid_2485)) (ArrayRef -1 (Ref __iruid_2482) (ApplyBinaryPrimOp Add (Ref __iruid_2485) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2484) (Ref __iruid_2486)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2474))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2477))))) (StreamFor __iruid_2487 (ToStream True (Ref __iruid_2476)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2487))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2474)))) (StreamFor __iruid_2488 (ToStream True (Ref __iruid_2475)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2488)))))) (Let __iruid_2489 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref __iruid_2489)))))))))) 2023-04-22 21:13:42.601 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 
177: (MakeTuple (0) (Let __iruid_2540 (ToDict (StreamMap __iruid_2541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_2541)) (SelectFields (scores) (Ref __iruid_2541))))) (Let __iruid_2542 (ToArray (StreamMap __iruid_2543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_2543) None (__uid_4 (Let __iruid_2544 (MakeStruct (s (GetField s (Ref __iruid_2543)))) (If (IsNA (Ref __iruid_2540)) (NA Struct{scores:Array[Float64]}) (Let __iruid_2545 (LowerBoundOnOrderedCollection True (Ref __iruid_2540) (Ref __iruid_2544)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_2545) (ArrayLen (CastToArray (Ref __iruid_2540)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_2540)) (Ref __iruid_2545))) (Ref __iruid_2544)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_2540)) (Ref __iruid_2545))) (NA Struct{scores:Array[Float64]})))))))))) (Let __iruid_2546 (MakeStruct (rows (ToArray (StreamMap __iruid_2547 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2542)) (I32 1)) (Let __iruid_2548 (ArrayRef -1 (Ref __iruid_2542) (Ref __iruid_2547)) (InsertFields (SelectFields () (Ref __iruid_2548)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_2548)))))))))) (Let __iruid_2549 (RunAgg ((TypedStateSig +PInt64)) (Begin (InitOp 0 (Sum (TypedStateSig +PInt64)) ())) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (Let __iruid_2550 (CollectDistributedArray table_aggregate_singlestage __iruid_2551 __iruid_2552 (Let __iruid_2553 (ArrayLen (GetField rows (Ref __iruid_2546))) (Let __iruid_2554 (ToArray (StreamMap __iruid_2555 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ApplyBinaryPrimOp RoundToNegInfDivide (ApplyBinaryPrimOp Multiply (Ref __iruid_2555) (Ref __iruid_2553)) (I32 16)))) (Let __iruid_2556 (MakeArray Array[Int32] (Ref __iruid_2553)) (Let __iruid_2557 (If (IsNA (Ref __iruid_2554)) (NA Array[Int32]) (If (IsNA (Ref __iruid_2556)) (NA Array[Int32]) (ToArray (StreamFlatMap __iruid_2558 (MakeStream Stream[Array[Int32]] False (Ref __iruid_2554) (Ref __iruid_2556)) (ToStream False (Ref __iruid_2558)))))) (Let __iruid_2559 (GetField rows (Ref __iruid_2546)) (StreamMap __iruid_2560 (StreamRange -1 False (I32 0) (I32 16) (I32 1)) (ToArray (StreamMap __iruid_2561 (StreamRange -1 False (ArrayRef -1 (Ref __iruid_2557) (Ref __iruid_2560)) (ArrayRef -1 (Ref __iruid_2557) (ApplyBinaryPrimOp Add (Ref __iruid_2560) (I32 1))) (I32 1)) (ArrayRef -1 (Ref __iruid_2559) (Ref __iruid_2561)))))))))) (MakeStruct (__iruid_1169 (Ref __iruid_2549))) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (GetField __iruid_1169 (Ref __iruid_2552))))) (StreamFor __iruid_2562 (ToStream True (Ref __iruid_2551)) (Begin (SeqOp 0 (Sum (TypedStateSig +PInt64)) ((Apply 12 toInt64 () Int64 (IsNA (GetField __scores (Ref __iruid_2562))))))))) (MakeTuple (0) (AggStateValue 0 (TypedStateSig +PInt64)))) (NA String)) (RunAgg ((TypedStateSig +PInt64)) (Begin (Begin (InitFromSerializedValue 0 (TypedStateSig +PInt64) (GetTupleElement 0 (Ref __iruid_2549)))) (StreamFor __iruid_2563 (ToStream True (Ref __iruid_2550)) (Begin (CombOpValue 0 (Sum (TypedStateSig +PInt64)) (GetTupleElement 0 (Ref __iruid_2563)))))) (Let __iruid_2564 (MakeTuple (0) (ResultOp 0 (Sum (TypedStateSig +PInt64)))) (GetTupleElement 0 (Ref 
__iruid_2564)))))))))) 2023-04-22 21:13:42.703 : INFO: instruction count: 3: __C2214HailClassLoaderContainer. 2023-04-22 21:13:42.703 : INFO: instruction count: 3: __C2214HailClassLoaderContainer. 2023-04-22 21:13:42.703 : INFO: instruction count: 3: __C2216FSContainer. 2023-04-22 21:13:42.703 : INFO: instruction count: 3: __C2216FSContainer. 2023-04-22 21:13:42.707 : INFO: instruction count: 3: __C2218collect_distributed_array_table_aggregate_singlestage. 2023-04-22 21:13:42.708 : INFO: instruction count: 282: __C2218collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:42.708 : INFO: instruction count: 17: __C2218collect_distributed_array_table_aggregate_singlestage.apply 2023-04-22 21:13:42.708 : INFO: instruction count: 58: __C2218collect_distributed_array_table_aggregate_singlestage.__m2220DECODE_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.708 : INFO: instruction count: 58: __C2218collect_distributed_array_table_aggregate_singlestage.__m2221INPLACE_DECODE_o_array_of_r_struct_of_o_array_of_r_float64END_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.708 : INFO: instruction count: 48: __C2218collect_distributed_array_table_aggregate_singlestage.__m2222INPLACE_DECODE_r_struct_of_o_array_of_r_float64END_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.708 : INFO: instruction count: 58: __C2218collect_distributed_array_table_aggregate_singlestage.__m2223INPLACE_DECODE_o_array_of_r_float64_TO_o_array_of_r_float64 2023-04-22 21:13:42.708 : INFO: instruction count: 10: __C2218collect_distributed_array_table_aggregate_singlestage.__m2224INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:42.708 : INFO: instruction count: 27: __C2218collect_distributed_array_table_aggregate_singlestage.__m2229DECODE_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.708 : INFO: instruction count: 17: __C2218collect_distributed_array_table_aggregate_singlestage.__m2230INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_r_struct_of_r_tuple_of_r_binaryENDEND 2023-04-22 21:13:42.708 : INFO: instruction count: 17: __C2218collect_distributed_array_table_aggregate_singlestage.__m2231INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:42.727 : INFO: instruction count: 31: __C2218collect_distributed_array_table_aggregate_singlestage.__m2232INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:42.727 : INFO: instruction count: 9: __C2218collect_distributed_array_table_aggregate_singlestage.__m2238begin_group_0 2023-04-22 21:13:42.728 : INFO: instruction count: 76: __C2218collect_distributed_array_table_aggregate_singlestage.__m2239begin_group_0 2023-04-22 21:13:42.728 : INFO: instruction count: 11: __C2218collect_distributed_array_table_aggregate_singlestage.__m2242setup_null 2023-04-22 21:13:42.728 : INFO: instruction count: 73: __C2218collect_distributed_array_table_aggregate_singlestage.__m2243split_StreamFor 2023-04-22 21:13:42.728 : INFO: instruction count: 55: __C2218collect_distributed_array_table_aggregate_singlestage.__m2251begin_group_0 2023-04-22 21:13:42.728 : INFO: instruction count: 5: __C2218collect_distributed_array_table_aggregate_singlestage.__m2252toInt64 2023-04-22 21:13:42.728 : INFO: instruction count: 11: __C2218collect_distributed_array_table_aggregate_singlestage.__m2256setup_null 2023-04-22 21:13:42.728 : INFO: instruction count: 9: 
__C2218collect_distributed_array_table_aggregate_singlestage.__m2259ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:42.728 : INFO: instruction count: 13: __C2218collect_distributed_array_table_aggregate_singlestage.__m2260ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:42.728 : INFO: instruction count: 16: __C2218collect_distributed_array_table_aggregate_singlestage.__m2261ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:42.728 : INFO: instruction count: 9: __C2218collect_distributed_array_table_aggregate_singlestage.setPartitionIndex 2023-04-22 21:13:42.728 : INFO: instruction count: 4: __C2218collect_distributed_array_table_aggregate_singlestage.addPartitionRegion 2023-04-22 21:13:42.728 : INFO: instruction count: 4: __C2218collect_distributed_array_table_aggregate_singlestage.setPool 2023-04-22 21:13:42.728 : INFO: instruction count: 3: __C2218collect_distributed_array_table_aggregate_singlestage.addHailClassLoader 2023-04-22 21:13:42.728 : INFO: instruction count: 3: __C2218collect_distributed_array_table_aggregate_singlestage.addFS 2023-04-22 21:13:42.728 : INFO: instruction count: 4: __C2218collect_distributed_array_table_aggregate_singlestage.addTaskContext 2023-04-22 21:13:42.751 : INFO: encoder cache hit 2023-04-22 21:13:42.752 MemoryStore: INFO: Block broadcast_170 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:13:42.756 MemoryStore: INFO: Block broadcast_170_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:13:42.757 BlockManagerInfo: INFO: Added broadcast_170_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:13:42.757 SparkContext: INFO: Created broadcast 170 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:42.757 : INFO: instruction count: 3: __C2049HailClassLoaderContainer. 2023-04-22 21:13:42.758 : INFO: instruction count: 3: __C2049HailClassLoaderContainer. 2023-04-22 21:13:42.758 : INFO: instruction count: 3: __C2051FSContainer. 2023-04-22 21:13:42.758 : INFO: instruction count: 3: __C2051FSContainer. 2023-04-22 21:13:42.814 : INFO: instruction count: 3: __C2053Compiled. 
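
Before the D-Array below is launched, the CollectDistributedArray context in the IR splits the collected column rows into 16 contiguous slices: it computes boundaries with ApplyBinaryPrimOp RoundToNegInfDivide (Multiply i n) 16 for i in 0..15, extends them with n itself, and task i then reads rows[bounds[i]:bounds[i+1]]. A plain-Python sketch of that arithmetic (not Hail API; n_rows and n_tasks are illustrative values):

def partition_bounds(n_rows, n_tasks=16):
    # floor(i * n_rows / n_tasks) for each task, plus n_rows as the final
    # bound; RoundToNegInfDivide is ordinary floor division for non-negative ints.
    return [i * n_rows // n_tasks for i in range(n_tasks)] + [n_rows]

bounds = partition_bounds(100)            # [0, 6, 12, 18, 25, ..., 93, 100]
slices = [(bounds[i], bounds[i + 1]) for i in range(16)]
assert sum(hi - lo for lo, hi in slices) == 100  # every row lands in exactly one task

This is why the scheduler below reports 16 output partitions and 16 tasks for the table_aggregate_singlestage stage even though everything runs on a single local executor.
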
2023-04-22 21:13:42.814 : INFO: instruction count: 59: __C2053Compiled.apply 2023-04-22 21:13:42.815 : INFO: instruction count: 377: __C2053Compiled.__m2055split_ToDict 2023-04-22 21:13:42.815 : INFO: instruction count: 12: __C2053Compiled.__m2065setup_jab 2023-04-22 21:13:42.815 : INFO: instruction count: 202: __C2053Compiled.__m2068arraySorter_outer 2023-04-22 21:13:42.815 : INFO: instruction count: 93: __C2053Compiled.__m2069arraySorter_merge 2023-04-22 21:13:42.815 : INFO: instruction count: 11: __C2053Compiled.__m2072ord_lt 2023-04-22 21:13:42.815 : INFO: instruction count: 52: __C2053Compiled.__m2073ord_ltNonnull 2023-04-22 21:13:42.815 : INFO: instruction count: 11: __C2053Compiled.__m2074ord_lt 2023-04-22 21:13:42.815 : INFO: instruction count: 21: __C2053Compiled.__m2075ord_ltNonnull 2023-04-22 21:13:42.816 : INFO: instruction count: 9: __C2053Compiled.__m2076ord_compareNonnull 2023-04-22 21:13:42.816 : INFO: instruction count: 89: __C2053Compiled.__m2077ord_compareNonnull 2023-04-22 21:13:42.816 : INFO: instruction count: 11: __C2053Compiled.__m2078ord_equiv 2023-04-22 21:13:42.816 : INFO: instruction count: 21: __C2053Compiled.__m2079ord_equivNonnull 2023-04-22 21:13:42.816 : INFO: instruction count: 36: __C2053Compiled.__m2080arraySorter_splitMerge 2023-04-22 21:13:42.816 : INFO: instruction count: 264: __C2053Compiled.__m2081distinctFromSorted 2023-04-22 21:13:42.816 : INFO: instruction count: 30: __C2053Compiled.__m2084ord_equiv 2023-04-22 21:13:42.816 : INFO: instruction count: 39: __C2053Compiled.__m2085ord_equivNonnull 2023-04-22 21:13:42.816 : INFO: instruction count: 211: __C2053Compiled.__m2093split_ToArray 2023-04-22 21:13:42.817 : INFO: instruction count: 146: __C2053Compiled.__m2101split_Let 2023-04-22 21:13:42.817 : INFO: instruction count: 60: __C2053Compiled.__m2104findElt 2023-04-22 21:13:42.817 : INFO: instruction count: 11: __C2053Compiled.__m2105ord_lt 2023-04-22 21:13:42.817 : INFO: instruction count: 44: __C2053Compiled.__m2106ord_ltNonnull 2023-04-22 21:13:42.817 : INFO: instruction count: 11: __C2053Compiled.__m2108ord_equiv 2023-04-22 21:13:42.817 : INFO: instruction count: 14: __C2053Compiled.__m2109ord_equivNonnull 2023-04-22 21:13:42.817 : INFO: instruction count: 35: __C2053Compiled.__m2110arrayref_bounds_check 2023-04-22 21:13:42.817 : INFO: instruction count: 11: __C2053Compiled.__m2111ord_equiv 2023-04-22 21:13:42.817 : INFO: instruction count: 31: __C2053Compiled.__m2112ord_equivNonnull 2023-04-22 21:13:42.817 : INFO: instruction count: 245: __C2053Compiled.__m2121split_Let 2023-04-22 21:13:42.818 : INFO: instruction count: 255: __C2053Compiled.__m2123split_ToArray 2023-04-22 21:13:42.818 : INFO: instruction count: 9: __C2053Compiled.__m2141begin_group_0 2023-04-22 21:13:42.818 : INFO: instruction count: 11: __C2053Compiled.__m2144setup_null 2023-04-22 21:13:42.818 : INFO: instruction count: 503: __C2053Compiled.__m2148split_CollectDistributedArray 2023-04-22 21:13:42.818 : INFO: instruction count: 177: __C2053Compiled.__m2151split_ToArray 2023-04-22 21:13:42.819 : INFO: instruction count: 295: __C2053Compiled.__m2167split_ToArray 2023-04-22 21:13:42.819 : INFO: instruction count: 12: __C2053Compiled.__m2187setup_iab 2023-04-22 21:13:42.819 : INFO: instruction count: 255: __C2053Compiled.__m2203split_ToArray 2023-04-22 21:13:42.819 : INFO: instruction count: 4: __C2053Compiled.setBackend 2023-04-22 21:13:42.819 : INFO: instruction count: 48: 
__C2053Compiled.__m2270ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_struct_of_o_array_of_r_float64ENDEND 2023-04-22 21:13:42.819 : INFO: instruction count: 35: __C2053Compiled.__m2271ENCODE_SIndexablePointer_TO_o_array_of_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.819 : INFO: instruction count: 48: __C2053Compiled.__m2272ENCODE_SBaseStructPointer_TO_r_struct_of_o_array_of_r_float64END 2023-04-22 21:13:42.819 : INFO: instruction count: 39: __C2053Compiled.__m2273ENCODE_SIndexablePointer_TO_o_array_of_r_float64 2023-04-22 21:13:42.819 : INFO: instruction count: 4: __C2053Compiled.__m2274ENCODE_SFloat64$_TO_r_float64 2023-04-22 21:13:42.820 : INFO: instruction count: 9: __C2053Compiled.__m2275ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_binaryENDENDEND 2023-04-22 21:13:42.820 : INFO: instruction count: 9: __C2053Compiled.__m2276ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryENDEND 2023-04-22 21:13:42.820 : INFO: instruction count: 13: __C2053Compiled.__m2277ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryEND 2023-04-22 21:13:42.820 : INFO: instruction count: 16: __C2053Compiled.__m2278ENCODE_SBinaryPointer_TO_r_binary 2023-04-22 21:13:42.820 : INFO: instruction count: 27: __C2053Compiled.__m2281DECODE_r_struct_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.820 : INFO: instruction count: 17: __C2053Compiled.__m2282INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_tuple_of_r_binaryEND 2023-04-22 21:13:42.820 : INFO: instruction count: 31: __C2053Compiled.__m2283INPLACE_DECODE_r_binary_TO_r_binary 2023-04-22 21:13:42.820 : INFO: instruction count: 9: __C2053Compiled.__m2293begin_group_0 2023-04-22 21:13:42.820 : INFO: instruction count: 64: __C2053Compiled.__m2294begin_group_0 2023-04-22 21:13:42.820 : INFO: instruction count: 11: __C2053Compiled.__m2297setup_null 2023-04-22 21:13:42.820 : INFO: instruction count: 68: __C2053Compiled.__m2298split_StreamFor 2023-04-22 21:13:42.820 : INFO: instruction count: 84: __C2053Compiled.__m2306begin_group_0 2023-04-22 21:13:42.820 : INFO: instruction count: 9: __C2053Compiled.setPartitionIndex 2023-04-22 21:13:42.820 : INFO: instruction count: 4: __C2053Compiled.addPartitionRegion 2023-04-22 21:13:42.820 : INFO: instruction count: 4: __C2053Compiled.setPool 2023-04-22 21:13:42.820 : INFO: instruction count: 3: __C2053Compiled.addHailClassLoader 2023-04-22 21:13:42.820 : INFO: instruction count: 3: __C2053Compiled.addFS 2023-04-22 21:13:42.820 : INFO: instruction count: 4: __C2053Compiled.addTaskContext 2023-04-22 21:13:42.820 : INFO: instruction count: 3: __C2053Compiled.setObjects 2023-04-22 21:13:42.820 : INFO: instruction count: 96: __C2053Compiled.addAndDecodeLiterals 2023-04-22 21:13:42.820 : INFO: instruction count: 18: __C2053Compiled.__m2315DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:13:42.820 : INFO: instruction count: 27: __C2053Compiled.__m2316DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.821 : INFO: instruction count: 58: __C2053Compiled.__m2317INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:13:42.821 : INFO: instruction count: 17: __C2053Compiled.__m2318INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:13:42.821 : INFO: instruction count: 31: __C2053Compiled.__m2319INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:42.821 : INFO: instruction count: 27: 
__C2053Compiled.__m2320DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:42.821 : INFO: instruction count: 58: __C2053Compiled.__m2321INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:42.821 : INFO: instruction count: 26: __C2053Compiled.__m2322INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:13:42.821 : INFO: instruction count: 58: __C2053Compiled.__m2323INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:13:42.821 : INFO: instruction count: 10: __C2053Compiled.__m2324INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:13:42.821 : INFO: instruction count: 3: __C2279staticWrapperClass_1. 2023-04-22 21:13:42.845 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.5M blocks / 512.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:42.962 : INFO: executing D-Array [table_aggregate_singlestage] with 16 tasks 2023-04-22 21:13:42.962 MemoryStore: INFO: Block broadcast_171 stored as values in memory (estimated size 72.0 B, free 25.1 GiB) 2023-04-22 21:13:42.964 MemoryStore: INFO: Block broadcast_171_piece0 stored as bytes in memory (estimated size 61.0 B, free 25.1 GiB) 2023-04-22 21:13:42.964 BlockManagerInfo: INFO: Added broadcast_171_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:13:42.970 SparkContext: INFO: Created broadcast 171 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:42.971 MemoryStore: INFO: Block broadcast_172 stored as values in memory (estimated size 429.5 KiB, free 25.1 GiB) 2023-04-22 21:13:42.984 MemoryStore: INFO: Block broadcast_172_piece0 stored as bytes in memory (estimated size 32.4 KiB, free 25.1 GiB) 2023-04-22 21:13:42.984 BlockManagerInfo: INFO: Added broadcast_172_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 32.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:42.985 SparkContext: INFO: Created broadcast 172 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:42.993 SparkContext: INFO: Starting job: collect at SparkBackend.scala:368 2023-04-22 21:13:42.993 DAGScheduler: INFO: Got job 44 (collect at SparkBackend.scala:368) with 16 output partitions 2023-04-22 21:13:42.993 DAGScheduler: INFO: Final stage: ResultStage 83 (collect at SparkBackend.scala:368) 2023-04-22 21:13:42.993 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:13:42.993 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:13:42.994 DAGScheduler: INFO: Submitting ResultStage 83 (SparkBackendComputeRDD[185] at RDD at SparkBackend.scala:784), which has no missing parents 2023-04-22 21:13:42.994 MemoryStore: INFO: Block broadcast_173 stored as values in memory (estimated size 16.5 KiB, free 25.1 GiB) 2023-04-22 21:13:42.996 MemoryStore: INFO: Block broadcast_173_piece0 stored as bytes in memory (estimated size 9.5 KiB, free 25.1 GiB) 2023-04-22 21:13:42.996 BlockManagerInfo: INFO: Added broadcast_173_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 9.5 KiB, free: 25.3 GiB) 2023-04-22 21:13:42.998 SparkContext: INFO: Created broadcast 173 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:13:42.999 DAGScheduler: INFO: Submitting 16 missing tasks from ResultStage 83 (SparkBackendComputeRDD[185] at RDD at SparkBackend.scala:784) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 
6, 7, 8, 9, 10, 11, 12, 13, 14)) 2023-04-22 21:13:42.999 TaskSchedulerImpl: INFO: Adding task set 83.0 with 16 tasks resource profile 0 2023-04-22 21:13:43.000 TaskSetManager: INFO: Starting task 0.0 in stage 83.0 (TID 424) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.000 Executor: INFO: Running task 0.0 in stage 83.0 (TID 424) 2023-04-22 21:13:43.001 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 0.0 in stage 83.0 (TID 424) 2023-04-22 21:13:43.007 : INFO: TaskReport: stage=83, partition=0, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.007 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 83.0 (TID 424) 2023-04-22 21:13:43.008 Executor: INFO: Finished task 0.0 in stage 83.0 (TID 424). 787 bytes result sent to driver 2023-04-22 21:13:43.008 TaskSetManager: INFO: Starting task 1.0 in stage 83.0 (TID 425) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.008 TaskSetManager: INFO: Finished task 0.0 in stage 83.0 (TID 424) in 9 ms on uger-c010.broadinstitute.org (executor driver) (1/16) 2023-04-22 21:13:43.012 Executor: INFO: Running task 1.0 in stage 83.0 (TID 425) 2023-04-22 21:13:43.013 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 1.0 in stage 83.0 (TID 425) 2023-04-22 21:13:43.014 : INFO: TaskReport: stage=83, partition=1, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.014 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 1.0 in stage 83.0 (TID 425) 2023-04-22 21:13:43.014 Executor: INFO: Finished task 1.0 in stage 83.0 (TID 425). 787 bytes result sent to driver 2023-04-22 21:13:43.015 TaskSetManager: INFO: Starting task 2.0 in stage 83.0 (TID 426) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.015 TaskSetManager: INFO: Finished task 1.0 in stage 83.0 (TID 425) in 7 ms on uger-c010.broadinstitute.org (executor driver) (2/16) 2023-04-22 21:13:43.022 Executor: INFO: Running task 2.0 in stage 83.0 (TID 426) 2023-04-22 21:13:43.023 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 2.0 in stage 83.0 (TID 426) 2023-04-22 21:13:43.024 : INFO: TaskReport: stage=83, partition=2, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.024 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 2.0 in stage 83.0 (TID 426) 2023-04-22 21:13:43.024 Executor: INFO: Finished task 2.0 in stage 83.0 (TID 426). 
744 bytes result sent to driver 2023-04-22 21:13:43.024 TaskSetManager: INFO: Starting task 3.0 in stage 83.0 (TID 427) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.024 TaskSetManager: INFO: Finished task 2.0 in stage 83.0 (TID 426) in 9 ms on uger-c010.broadinstitute.org (executor driver) (3/16) 2023-04-22 21:13:43.025 Executor: INFO: Running task 3.0 in stage 83.0 (TID 427) 2023-04-22 21:13:43.025 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 3.0 in stage 83.0 (TID 427) 2023-04-22 21:13:43.026 : INFO: TaskReport: stage=83, partition=3, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.026 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 3.0 in stage 83.0 (TID 427) 2023-04-22 21:13:43.026 Executor: INFO: Finished task 3.0 in stage 83.0 (TID 427). 744 bytes result sent to driver 2023-04-22 21:13:43.027 TaskSetManager: INFO: Starting task 4.0 in stage 83.0 (TID 428) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.027 TaskSetManager: INFO: Finished task 3.0 in stage 83.0 (TID 427) in 3 ms on uger-c010.broadinstitute.org (executor driver) (4/16) 2023-04-22 21:13:43.031 Executor: INFO: Running task 4.0 in stage 83.0 (TID 428) 2023-04-22 21:13:43.031 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 4.0 in stage 83.0 (TID 428) 2023-04-22 21:13:43.032 : INFO: TaskReport: stage=83, partition=4, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.032 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 4.0 in stage 83.0 (TID 428) 2023-04-22 21:13:43.032 Executor: INFO: Finished task 4.0 in stage 83.0 (TID 428). 744 bytes result sent to driver 2023-04-22 21:13:43.032 TaskSetManager: INFO: Starting task 5.0 in stage 83.0 (TID 429) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.033 TaskSetManager: INFO: Finished task 4.0 in stage 83.0 (TID 428) in 6 ms on uger-c010.broadinstitute.org (executor driver) (5/16) 2023-04-22 21:13:43.035 Executor: INFO: Running task 5.0 in stage 83.0 (TID 429) 2023-04-22 21:13:43.036 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 5.0 in stage 83.0 (TID 429) 2023-04-22 21:13:43.036 : INFO: TaskReport: stage=83, partition=5, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.036 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 83.0 (TID 429) 2023-04-22 21:13:43.036 Executor: INFO: Finished task 5.0 in stage 83.0 (TID 429). 
744 bytes result sent to driver 2023-04-22 21:13:43.037 TaskSetManager: INFO: Starting task 6.0 in stage 83.0 (TID 430) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.037 TaskSetManager: INFO: Finished task 5.0 in stage 83.0 (TID 429) in 5 ms on uger-c010.broadinstitute.org (executor driver) (6/16) 2023-04-22 21:13:43.038 Executor: INFO: Running task 6.0 in stage 83.0 (TID 430) 2023-04-22 21:13:43.038 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 6.0 in stage 83.0 (TID 430) 2023-04-22 21:13:43.039 : INFO: TaskReport: stage=83, partition=6, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.039 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 83.0 (TID 430) 2023-04-22 21:13:43.039 Executor: INFO: Finished task 6.0 in stage 83.0 (TID 430). 744 bytes result sent to driver 2023-04-22 21:13:43.039 TaskSetManager: INFO: Starting task 7.0 in stage 83.0 (TID 431) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.039 TaskSetManager: INFO: Finished task 6.0 in stage 83.0 (TID 430) in 2 ms on uger-c010.broadinstitute.org (executor driver) (7/16) 2023-04-22 21:13:43.048 Executor: INFO: Running task 7.0 in stage 83.0 (TID 431) 2023-04-22 21:13:43.049 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 7.0 in stage 83.0 (TID 431) 2023-04-22 21:13:43.049 : INFO: TaskReport: stage=83, partition=7, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.049 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 83.0 (TID 431) 2023-04-22 21:13:43.049 Executor: INFO: Finished task 7.0 in stage 83.0 (TID 431). 744 bytes result sent to driver 2023-04-22 21:13:43.050 TaskSetManager: INFO: Starting task 8.0 in stage 83.0 (TID 432) (uger-c010.broadinstitute.org, executor driver, partition 8, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.050 TaskSetManager: INFO: Finished task 7.0 in stage 83.0 (TID 431) in 11 ms on uger-c010.broadinstitute.org (executor driver) (8/16) 2023-04-22 21:13:43.063 Executor: INFO: Running task 8.0 in stage 83.0 (TID 432) 2023-04-22 21:13:43.063 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 8.0 in stage 83.0 (TID 432) 2023-04-22 21:13:43.064 : INFO: TaskReport: stage=83, partition=8, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.064 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 83.0 (TID 432) 2023-04-22 21:13:43.064 Executor: INFO: Finished task 8.0 in stage 83.0 (TID 432). 
744 bytes result sent to driver 2023-04-22 21:13:43.064 TaskSetManager: INFO: Starting task 9.0 in stage 83.0 (TID 433) (uger-c010.broadinstitute.org, executor driver, partition 9, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.065 TaskSetManager: INFO: Finished task 8.0 in stage 83.0 (TID 432) in 15 ms on uger-c010.broadinstitute.org (executor driver) (9/16) 2023-04-22 21:13:43.065 Executor: INFO: Running task 9.0 in stage 83.0 (TID 433) 2023-04-22 21:13:43.066 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 9.0 in stage 83.0 (TID 433) 2023-04-22 21:13:43.066 : INFO: TaskReport: stage=83, partition=9, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.066 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 83.0 (TID 433) 2023-04-22 21:13:43.066 Executor: INFO: Finished task 9.0 in stage 83.0 (TID 433). 744 bytes result sent to driver 2023-04-22 21:13:43.066 TaskSetManager: INFO: Starting task 10.0 in stage 83.0 (TID 434) (uger-c010.broadinstitute.org, executor driver, partition 10, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.067 TaskSetManager: INFO: Finished task 9.0 in stage 83.0 (TID 433) in 3 ms on uger-c010.broadinstitute.org (executor driver) (10/16) 2023-04-22 21:13:43.067 Executor: INFO: Running task 10.0 in stage 83.0 (TID 434) 2023-04-22 21:13:43.068 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 10.0 in stage 83.0 (TID 434) 2023-04-22 21:13:43.068 : INFO: TaskReport: stage=83, partition=10, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.068 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 83.0 (TID 434) 2023-04-22 21:13:43.069 Executor: INFO: Finished task 10.0 in stage 83.0 (TID 434). 744 bytes result sent to driver 2023-04-22 21:13:43.069 TaskSetManager: INFO: Starting task 11.0 in stage 83.0 (TID 435) (uger-c010.broadinstitute.org, executor driver, partition 11, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.069 TaskSetManager: INFO: Finished task 10.0 in stage 83.0 (TID 434) in 3 ms on uger-c010.broadinstitute.org (executor driver) (11/16) 2023-04-22 21:13:43.070 Executor: INFO: Running task 11.0 in stage 83.0 (TID 435) 2023-04-22 21:13:43.071 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 11.0 in stage 83.0 (TID 435) 2023-04-22 21:13:43.071 : INFO: TaskReport: stage=83, partition=11, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.071 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 83.0 (TID 435) 2023-04-22 21:13:43.071 Executor: INFO: Finished task 11.0 in stage 83.0 (TID 435). 
744 bytes result sent to driver 2023-04-22 21:13:43.072 TaskSetManager: INFO: Starting task 12.0 in stage 83.0 (TID 436) (uger-c010.broadinstitute.org, executor driver, partition 12, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.072 TaskSetManager: INFO: Finished task 11.0 in stage 83.0 (TID 435) in 3 ms on uger-c010.broadinstitute.org (executor driver) (12/16) 2023-04-22 21:13:43.072 Executor: INFO: Running task 12.0 in stage 83.0 (TID 436) 2023-04-22 21:13:43.073 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 12.0 in stage 83.0 (TID 436) 2023-04-22 21:13:43.073 : INFO: TaskReport: stage=83, partition=12, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.073 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 83.0 (TID 436) 2023-04-22 21:13:43.074 Executor: INFO: Finished task 12.0 in stage 83.0 (TID 436). 744 bytes result sent to driver 2023-04-22 21:13:43.074 TaskSetManager: INFO: Starting task 13.0 in stage 83.0 (TID 437) (uger-c010.broadinstitute.org, executor driver, partition 13, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.074 TaskSetManager: INFO: Finished task 12.0 in stage 83.0 (TID 436) in 2 ms on uger-c010.broadinstitute.org (executor driver) (13/16) 2023-04-22 21:13:43.074 Executor: INFO: Running task 13.0 in stage 83.0 (TID 437) 2023-04-22 21:13:43.075 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 13.0 in stage 83.0 (TID 437) 2023-04-22 21:13:43.075 : INFO: TaskReport: stage=83, partition=13, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.075 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 13.0 in stage 83.0 (TID 437) 2023-04-22 21:13:43.076 Executor: INFO: Finished task 13.0 in stage 83.0 (TID 437). 744 bytes result sent to driver 2023-04-22 21:13:43.076 TaskSetManager: INFO: Starting task 14.0 in stage 83.0 (TID 438) (uger-c010.broadinstitute.org, executor driver, partition 14, PROCESS_LOCAL, 26319 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.076 TaskSetManager: INFO: Finished task 13.0 in stage 83.0 (TID 437) in 2 ms on uger-c010.broadinstitute.org (executor driver) (14/16) 2023-04-22 21:13:43.081 Executor: INFO: Running task 14.0 in stage 83.0 (TID 438) 2023-04-22 21:13:43.082 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 14.0 in stage 83.0 (TID 438) 2023-04-22 21:13:43.082 : INFO: TaskReport: stage=83, partition=14, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.082 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 14.0 in stage 83.0 (TID 438) 2023-04-22 21:13:43.082 Executor: INFO: Finished task 14.0 in stage 83.0 (TID 438). 
744 bytes result sent to driver 2023-04-22 21:13:43.082 TaskSetManager: INFO: Starting task 15.0 in stage 83.0 (TID 439) (uger-c010.broadinstitute.org, executor driver, partition 15, PROCESS_LOCAL, 26404 bytes) taskResourceAssignments Map() 2023-04-22 21:13:43.083 TaskSetManager: INFO: Finished task 14.0 in stage 83.0 (TID 438) in 7 ms on uger-c010.broadinstitute.org (executor driver) (15/16) 2023-04-22 21:13:43.083 Executor: INFO: Running task 15.0 in stage 83.0 (TID 439) 2023-04-22 21:13:43.084 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 15.0 in stage 83.0 (TID 439) 2023-04-22 21:13:43.084 : INFO: TaskReport: stage=83, partition=15, attempt=0, peakBytes=196608, peakBytesReadable=192.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:43.084 : INFO: RegionPool: FREE: 192.0K allocated (192.0K blocks / 0 chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 15.0 in stage 83.0 (TID 439) 2023-04-22 21:13:43.085 Executor: INFO: Finished task 15.0 in stage 83.0 (TID 439). 744 bytes result sent to driver 2023-04-22 21:13:43.085 TaskSetManager: INFO: Finished task 15.0 in stage 83.0 (TID 439) in 3 ms on uger-c010.broadinstitute.org (executor driver) (16/16) 2023-04-22 21:13:43.085 TaskSchedulerImpl: INFO: Removed TaskSet 83.0, whose tasks have all completed, from pool 2023-04-22 21:13:43.085 DAGScheduler: INFO: ResultStage 83 (collect at SparkBackend.scala:368) finished in 0.091 s 2023-04-22 21:13:43.085 DAGScheduler: INFO: Job 44 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:13:43.085 TaskSchedulerImpl: INFO: Killing all running tasks in stage 83: Stage finished 2023-04-22 21:13:43.086 DAGScheduler: INFO: Job 44 finished: collect at SparkBackend.scala:368, took 0.092942 s 2023-04-22 21:13:43.086 : INFO: executed D-Array [table_aggregate_singlestage] in 123.871ms 2023-04-22 21:13:43.086 : INFO: took 2.467s 2023-04-22 21:13:43.086 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (I64 0) 2023-04-22 21:13:43.087 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (I64 0) 2023-04-22 21:13:43.087 : INFO: after EvalRelationalLets: IR size 2: (MakeTuple (0) (I64 0)) 2023-04-22 21:13:43.087 : INFO: after LowerAndExecuteShuffles: IR size 2: (MakeTuple (0) (I64 0)) 2023-04-22 21:13:43.088 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:13:43.088 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:13:43.088 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Int64] ) 2023-04-22 21:13:43.088 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Int64] )) 2023-04-22 21:13:43.089 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.089 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.089 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.089 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.089 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.091 : INFO: encoder cache miss (35 hits, 17 misses, 0.673) 2023-04-22 21:13:43.093 : INFO: instruction count: 3: 
__C2339HailClassLoaderContainer. 2023-04-22 21:13:43.093 : INFO: instruction count: 3: __C2339HailClassLoaderContainer. 2023-04-22 21:13:43.093 : INFO: instruction count: 3: __C2341FSContainer. 2023-04-22 21:13:43.093 : INFO: instruction count: 3: __C2341FSContainer. 2023-04-22 21:13:43.093 : INFO: instruction count: 3: __C2343etypeEncode. 2023-04-22 21:13:43.093 : INFO: instruction count: 7: __C2343etypeEncode.apply 2023-04-22 21:13:43.093 : INFO: instruction count: 9: __C2343etypeEncode.__m2345ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_int64ENDENDEND 2023-04-22 21:13:43.093 : INFO: instruction count: 9: __C2343etypeEncode.__m2346ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int64ENDEND 2023-04-22 21:13:43.093 : INFO: instruction count: 13: __C2343etypeEncode.__m2347ENCODE_SBaseStructPointer_TO_r_struct_of_r_int64END 2023-04-22 21:13:43.093 : INFO: instruction count: 4: __C2343etypeEncode.__m2348ENCODE_SInt64$_TO_r_int64 2023-04-22 21:13:43.095 MemoryStore: INFO: Block broadcast_174 stored as values in memory (estimated size 104.0 B, free 25.1 GiB) 2023-04-22 21:13:43.097 MemoryStore: INFO: Block broadcast_174_piece0 stored as bytes in memory (estimated size 60.0 B, free 25.1 GiB) 2023-04-22 21:13:43.098 BlockManagerInfo: INFO: Added broadcast_174_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 60.0 B, free: 25.3 GiB) 2023-04-22 21:13:43.098 SparkContext: INFO: Created broadcast 174 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:43.098 : INFO: instruction count: 3: __C2325HailClassLoaderContainer. 2023-04-22 21:13:43.098 : INFO: instruction count: 3: __C2325HailClassLoaderContainer. 2023-04-22 21:13:43.098 : INFO: instruction count: 3: __C2327FSContainer. 2023-04-22 21:13:43.098 : INFO: instruction count: 3: __C2327FSContainer. 2023-04-22 21:13:43.099 : INFO: instruction count: 3: __C2329Compiled. 
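[Editor's note] The encoder cache counters in the records above ("encoder cache miss (35 hits, 17 misses, 0.673)") appear to report a running hit fraction as their third field. This is an assumption about the log format, not documented Hail behavior; a one-line check under that assumption:

    # Assumption: third field of the "encoder cache" record is hits / (hits + misses)
    hits, misses = 35, 17
    print(round(hits / (hits + misses), 3))   # -> 0.673, matching the value logged above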
2023-04-22 21:13:43.099 : INFO: instruction count: 7: __C2329Compiled.apply 2023-04-22 21:13:43.099 : INFO: instruction count: 9: __C2329Compiled.setPartitionIndex 2023-04-22 21:13:43.099 : INFO: instruction count: 4: __C2329Compiled.addPartitionRegion 2023-04-22 21:13:43.099 : INFO: instruction count: 4: __C2329Compiled.setPool 2023-04-22 21:13:43.099 : INFO: instruction count: 3: __C2329Compiled.addHailClassLoader 2023-04-22 21:13:43.099 : INFO: instruction count: 3: __C2329Compiled.addFS 2023-04-22 21:13:43.099 : INFO: instruction count: 4: __C2329Compiled.addTaskContext 2023-04-22 21:13:43.099 : INFO: instruction count: 41: __C2329Compiled.addAndDecodeLiterals 2023-04-22 21:13:43.099 : INFO: instruction count: 27: __C2329Compiled.__m2335DECODE_r_struct_of_r_struct_of_r_struct_of_r_int64ENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:43.099 : INFO: instruction count: 17: __C2329Compiled.__m2336INPLACE_DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_r_tuple_of_r_tuple_of_r_int64ENDEND 2023-04-22 21:13:43.099 : INFO: instruction count: 17: __C2329Compiled.__m2337INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:13:43.099 : INFO: instruction count: 10: __C2329Compiled.__m2338INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:43.100 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Int64] )) 2023-04-22 21:13:43.100 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.100 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.100 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.100 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.100 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.102 : INFO: encoder cache hit 2023-04-22 21:13:43.103 MemoryStore: INFO: Block broadcast_175 stored as values in memory (estimated size 104.0 B, free 25.1 GiB) 2023-04-22 21:13:43.104 MemoryStore: INFO: Block broadcast_175_piece0 stored as bytes in memory (estimated size 60.0 B, free 25.1 GiB) 2023-04-22 21:13:43.104 BlockManagerInfo: INFO: Added broadcast_175_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 60.0 B, free: 25.3 GiB) 2023-04-22 21:13:43.104 SparkContext: INFO: Created broadcast 175 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:43.104 : INFO: instruction count: 3: __C2349HailClassLoaderContainer. 2023-04-22 21:13:43.104 : INFO: instruction count: 3: __C2349HailClassLoaderContainer. 2023-04-22 21:13:43.105 : INFO: instruction count: 3: __C2351FSContainer. 2023-04-22 21:13:43.105 : INFO: instruction count: 3: __C2351FSContainer. 2023-04-22 21:13:43.105 : INFO: instruction count: 3: __C2353Compiled. 
2023-04-22 21:13:43.105 : INFO: instruction count: 7: __C2353Compiled.apply 2023-04-22 21:13:43.105 : INFO: instruction count: 9: __C2353Compiled.setPartitionIndex 2023-04-22 21:13:43.105 : INFO: instruction count: 4: __C2353Compiled.addPartitionRegion 2023-04-22 21:13:43.105 : INFO: instruction count: 4: __C2353Compiled.setPool 2023-04-22 21:13:43.105 : INFO: instruction count: 3: __C2353Compiled.addHailClassLoader 2023-04-22 21:13:43.105 : INFO: instruction count: 3: __C2353Compiled.addFS 2023-04-22 21:13:43.105 : INFO: instruction count: 4: __C2353Compiled.addTaskContext 2023-04-22 21:13:43.106 : INFO: instruction count: 41: __C2353Compiled.addAndDecodeLiterals 2023-04-22 21:13:43.106 : INFO: instruction count: 27: __C2353Compiled.__m2359DECODE_r_struct_of_r_struct_of_r_struct_of_r_int64ENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:43.106 : INFO: instruction count: 17: __C2353Compiled.__m2360INPLACE_DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_r_tuple_of_r_tuple_of_r_int64ENDEND 2023-04-22 21:13:43.106 : INFO: instruction count: 17: __C2353Compiled.__m2361INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:13:43.106 : INFO: instruction count: 10: __C2353Compiled.__m2362INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:43.106 : INFO: initial IR: IR size 2: (MakeTuple (0) (Literal Tuple[Int64] )) 2023-04-22 21:13:43.106 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.106 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.106 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.107 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.107 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Tuple[Int64]] ) 2023-04-22 21:13:43.108 : INFO: encoder cache hit 2023-04-22 21:13:43.109 MemoryStore: INFO: Block broadcast_176 stored as values in memory (estimated size 104.0 B, free 25.1 GiB) 2023-04-22 21:13:43.110 MemoryStore: INFO: Block broadcast_176_piece0 stored as bytes in memory (estimated size 60.0 B, free 25.1 GiB) 2023-04-22 21:13:43.111 BlockManagerInfo: INFO: Added broadcast_176_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 60.0 B, free: 25.3 GiB) 2023-04-22 21:13:43.111 SparkContext: INFO: Created broadcast 176 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:43.111 : INFO: instruction count: 3: __C2363HailClassLoaderContainer. 2023-04-22 21:13:43.111 : INFO: instruction count: 3: __C2363HailClassLoaderContainer. 2023-04-22 21:13:43.111 : INFO: instruction count: 3: __C2365FSContainer. 2023-04-22 21:13:43.111 : INFO: instruction count: 3: __C2365FSContainer. 2023-04-22 21:13:43.112 : INFO: instruction count: 3: __C2367Compiled. 
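[Editor's note] At this point the query IR has been folded to a constant (a Literal Tuple[Int64]) and only tiny ~60 B literal broadcasts remain; combined with the 16-task collect for the D-Array [table_aggregate_singlestage] earlier in this dump, this is the shape of a simple single-stage table aggregation returning one Int64. A minimal sketch of such a query, assuming a hypothetical range table; the actual query behind hail_query_5 is not recorded in this log:

    # Hypothetical reproduction of the query shape seen in this log, not the user's actual code.
    import hail as hl

    hl.init(master='local[*]')                               # local mode, as configured in this log
    ht = hl.utils.range_table(1_000_000, n_partitions=16)    # 16 partitions -> a 16-task collect stage
    n = ht.aggregate(hl.agg.count())                         # single Int64 result, aggregated in one stage
    print(n)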
2023-04-22 21:13:43.112 : INFO: instruction count: 7: __C2367Compiled.apply 2023-04-22 21:13:43.112 : INFO: instruction count: 9: __C2367Compiled.setPartitionIndex 2023-04-22 21:13:43.112 : INFO: instruction count: 4: __C2367Compiled.addPartitionRegion 2023-04-22 21:13:43.112 : INFO: instruction count: 4: __C2367Compiled.setPool 2023-04-22 21:13:43.112 : INFO: instruction count: 3: __C2367Compiled.addHailClassLoader 2023-04-22 21:13:43.112 : INFO: instruction count: 3: __C2367Compiled.addFS 2023-04-22 21:13:43.112 : INFO: instruction count: 4: __C2367Compiled.addTaskContext 2023-04-22 21:13:43.112 : INFO: instruction count: 41: __C2367Compiled.addAndDecodeLiterals 2023-04-22 21:13:43.112 : INFO: instruction count: 27: __C2367Compiled.__m2373DECODE_r_struct_of_r_struct_of_r_struct_of_r_int64ENDENDEND_TO_SBaseStructPointer 2023-04-22 21:13:43.112 : INFO: instruction count: 17: __C2367Compiled.__m2374INPLACE_DECODE_r_struct_of_r_struct_of_r_int64ENDEND_TO_r_tuple_of_r_tuple_of_r_int64ENDEND 2023-04-22 21:13:43.112 : INFO: instruction count: 17: __C2367Compiled.__m2375INPLACE_DECODE_r_struct_of_r_int64END_TO_r_tuple_of_r_int64END 2023-04-22 21:13:43.113 : INFO: instruction count: 10: __C2367Compiled.__m2376INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:13:43.114 : INFO: encoder cache miss (37 hits, 18 misses, 0.673) 2023-04-22 21:13:43.115 : INFO: instruction count: 3: __C2377HailClassLoaderContainer. 2023-04-22 21:13:43.115 : INFO: instruction count: 3: __C2377HailClassLoaderContainer. 2023-04-22 21:13:43.115 : INFO: instruction count: 3: __C2379FSContainer. 2023-04-22 21:13:43.115 : INFO: instruction count: 3: __C2379FSContainer. 2023-04-22 21:13:43.115 : INFO: instruction count: 3: __C2381etypeEncode. 2023-04-22 21:13:43.115 : INFO: instruction count: 7: __C2381etypeEncode.apply 2023-04-22 21:13:43.115 : INFO: instruction count: 21: __C2381etypeEncode.__m2383ENCODE_SBaseStructPointer_TO_o_struct_of_o_int64END 2023-04-22 21:13:43.115 : INFO: instruction count: 4: __C2381etypeEncode.__m2384ENCODE_SInt64$_TO_o_int64 2023-04-22 21:13:43.116 : INFO: finished execution of query hail_query_5, result size is 9.00 B 2023-04-22 21:13:43.116 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.117 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:43.117 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.117 : INFO: RegionPool: FREE: 2.5M allocated (1.7M blocks / 880.0K chunks), regions.size = 3, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode total 4.261s self 15.832ms children 4.245s %children 99.63% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 74.203ms self 0.009ms children 74.194ms %children 99.99% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 74.117ms self 0.034ms children 74.082ms %children 99.95% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 74.082ms self 0.142ms children 73.940ms %children 99.81% 2023-04-22 
21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.634ms self 0.634ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.517ms self 0.517ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.154ms self 1.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.580ms self 1.580ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 2.072ms self 2.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 40.450ms self 40.450ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.159ms self 0.159ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.331ms self 0.331ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.968ms self 0.968ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.421ms self 0.421ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.344ms self 1.344ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.131ms self 0.131ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.629ms self 1.629ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.304ms self 0.304ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial 
IR/LoweringTransformation/Optimize/NormalizeNames total 0.624ms self 0.624ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.294ms self 0.294ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.284ms self 1.284ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 19.641ms self 19.641ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 30.403ms self 0.007ms children 30.395ms %children 99.98% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 30.069ms self 30.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.313ms self 0.313ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 103.110ms self 0.008ms children 103.102ms %children 99.99% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 102.977ms self 0.034ms children 102.943ms %children 99.97% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 102.943ms self 0.064ms children 102.879ms %children 99.94% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.252ms self 0.252ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.805ms self 0.805ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.270ms self 1.270ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.117 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 52.992ms self 52.992ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: 
relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 2.245ms self 2.245ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 2.633ms self 2.633ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.300ms self 0.300ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.342ms self 0.342ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.691ms self 0.691ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.823ms self 0.823ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 11.064ms self 11.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.190ms self 0.190ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 2.367ms self 2.367ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.305ms self 0.305ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.336ms self 0.336ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.668ms self 0.668ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.824ms self 0.824ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 1.961ms self 1.961ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.194ms self 0.194ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 22.442ms self 22.442ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.093ms self 0.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 0.376ms self 0.005ms children 0.370ms %children 98.57% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.276ms self 0.276ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 4.011s self 0.011ms children 4.011s %children 100.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 4.011s self 12.511ms children 3.998s %children 99.69% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.032ms self 0.004ms children 0.028ms %children 86.90% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.561ms self 0.005ms children 0.555ms %children 99.03% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.533ms self 0.019ms children 0.514ms %children 96.35% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.514ms self 0.021ms children 0.493ms %children 95.87% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.149ms self 0.149ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.159ms self 0.159ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 973.419ms self 0.007ms children 973.412ms %children 100.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 973.396ms self 774.225ms children 199.171ms %children 20.46% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 0.681ms self 0.009ms children 0.673ms %children 98.71% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.658ms self 0.019ms children 0.639ms %children 97.14% 2023-04-22 21:13:43.118 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.639ms self 0.025ms children 0.614ms %children 96.12% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.118 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.104ms self 0.104ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.166ms self 0.166ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial 
IR/LoweringTransformation/Optimize/Simplify total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 0.040ms self 0.004ms children 0.036ms %children 89.50% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.177ms self 0.003ms children 0.174ms %children 98.15% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.171ms self 0.009ms children 0.162ms %children 94.52% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.162ms self 0.012ms children 0.150ms %children 92.86% 2023-04-22 21:13:43.119 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.027ms self 0.003ms children 0.024ms %children 88.17% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.119 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.017ms self 0.002ms children 0.014ms %children 85.15% 2023-04-22 21:13:43.128 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.128 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.128 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.128 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 0.020ms self 0.002ms children 0.018ms %children 88.60% 2023-04-22 21:13:43.128 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 1.849ms self 0.007ms children 1.842ms %children 99.63% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 1.624ms self 1.624ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.213ms self 0.035ms children 0.178ms %children 83.62% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.178ms self 0.017ms children 0.161ms %children 90.30% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.033ms self 0.033ms children 0.000ms %children 0.00% 
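[Editor's note] The timing records in this dump follow a fixed pattern, "timing <path> total T self S children C %children P%", where the slash-separated path encodes the nesting of lowering passes and the fields satisfy T ≈ S + C and P ≈ 100·C/T. A small sketch (not part of Hail) that checks this bookkeeping against the top-level entry logged above:

    # Parse one "timing" record and verify total = self + children and the %children figure.
    import re

    line = ("SparkBackend.executeEncode total 4.261s self 15.832ms "
            "children 4.245s %children 99.63%")

    def ms(tok: str) -> float:
        """Convert a duration token like '4.261s' or '15.832ms' to milliseconds."""
        return float(tok[:-2]) if tok.endswith('ms') else float(tok[:-1]) * 1000.0

    m = re.search(r'total (\S+) self (\S+) children (\S+) %children ([\d.]+)%', line)
    total, self_ms, children = (ms(m.group(i)) for i in (1, 2, 3))
    assert abs(total - (self_ms + children)) < 1.0                     # totals add up (within rounding)
    assert abs(100.0 * children / total - float(m.group(4))) < 0.1    # 4245/4261 ~= 99.63%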
2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 0.039ms self 0.004ms children 0.035ms %children 89.81% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.139ms self 0.003ms 
children 0.135ms %children 97.64% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.133ms self 0.008ms children 0.125ms %children 94.16% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.125ms self 0.009ms children 0.115ms %children 92.51% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.129 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 192.588ms self 151.027ms children 41.561ms %children 21.58% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.314ms self 0.005ms children 0.309ms %children 98.44% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.302ms self 0.019ms children 0.283ms %children 93.87% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.283ms self 0.017ms children 0.266ms %children 93.85% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.093ms self 0.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.031ms self 
0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.025ms self 0.004ms children 0.021ms %children 84.91% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.129ms self 0.003ms children 0.126ms %children 97.63% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.123ms self 0.007ms children 0.117ms %children 94.59% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.117ms self 0.010ms children 0.106ms %children 91.21% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.130 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 30.041ms self 0.006ms children 30.036ms %children 99.98% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 30.020ms self 30.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.131 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.152ms self 0.004ms children 0.149ms %children 97.49% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.145ms self 0.008ms children 0.137ms %children 94.48% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.137ms self 0.012ms children 0.125ms %children 91.10% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 2.010ms self 2.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.280ms self 0.004ms children 0.276ms %children 98.55% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.269ms self 0.008ms children 0.261ms %children 96.89% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.261ms self 0.016ms children 0.245ms %children 93.76% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 
0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.018ms self 0.003ms children 0.015ms %children 83.19% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.132 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.117ms self 0.003ms children 0.114ms %children 97.33% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.111ms self 0.007ms children 0.105ms %children 94.10% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.105ms self 0.009ms children 0.096ms %children 91.63% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: 
compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 1.961ms self 0.004ms children 1.957ms %children 99.79% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 1.945ms self 1.945ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.133ms self 0.003ms children 0.130ms %children 97.41% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation total 0.126ms self 0.007ms children 0.119ms %children 94.56% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.119ms self 0.009ms children 0.110ms %children 92.15% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 1.951ms self 1.951ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.274ms self 0.004ms children 0.269ms 
%children 98.41% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.263ms self 0.008ms children 0.255ms %children 96.79% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.255ms self 0.016ms children 0.238ms %children 93.64% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.134 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.017ms self 0.003ms children 0.014ms %children 81.61% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.107ms self 0.003ms children 0.104ms %children 97.29% 2023-04-22 21:13:43.135 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.101ms self 0.007ms children 0.095ms %children 93.48% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.095ms self 0.009ms children 0.086ms %children 91.02% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 1.950ms self 0.004ms children 1.946ms %children 99.79% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 1.938ms self 1.938ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.136ms self 0.004ms children 0.131ms %children 96.85% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.127ms self 0.007ms children 0.119ms %children 94.12% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.119ms self 0.010ms children 0.109ms %children 91.38% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.020ms self 0.020ms children 0.000ms %children 0.00% 
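
[Editor's note: every "timing" entry above follows one pattern: a slash-separated lowering path (SparkBackend.executeEncode/EvalRelationalLets/...), then total, self, and children times in milliseconds, where total is roughly self + children and %children is children / total * 100. The snippet below is a minimal, illustrative parser for that pattern, written only against the line format visible in this log; the regex and the parse_timing helper are assumptions for illustration, not part of Hail or Spark.]

import re

# Illustrative only: parse lines of the form
#   <timestamp> : INFO: timing <path> total Xms self Yms children Zms %children P%
# The path encodes nesting with '/' separators.
TIMING_RE = re.compile(
    r"timing (?P<path>.+?) "
    r"total (?P<total>[\d.]+)ms "
    r"self (?P<self>[\d.]+)ms "
    r"children (?P<children>[\d.]+)ms "
    r"%children (?P<pct>[\d.]+)%"
)

def parse_timing(entry: str):
    """Return (stages, total_ms, self_ms, children_ms, pct_children) or None."""
    m = TIMING_RE.search(entry)
    if m is None:
        return None
    stages = m.group("path").split("/")  # one element per nesting level
    return (
        stages,
        float(m.group("total")),
        float(m.group("self")),
        float(m.group("children")),
        float(m.group("pct")),
    )

if __name__ == "__main__":
    # Entry copied from the LowerArrayAggsToRunAggs line logged at 21:13:43.131 above.
    line = (
        "2023-04-22 21:13:43.131 : INFO: timing "
        "SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/"
        "LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs "
        "total 30.041ms self 0.006ms children 30.036ms %children 99.98%"
    )
    stages, total, self_ms, children, pct = parse_timing(line)
    print(len(stages), "levels deep; leaf stage:", stages[-1])
    # total is (approximately) self + children, and %children = children / total * 100
    assert abs(total - (self_ms + children)) < 0.01
    assert abs(pct - children / total * 100) < 0.1

[Run against that LowerArrayAggsToRunAggs entry, this recovers a 7-level path and confirms that the logged total, self, children, and %children figures are mutually consistent.]
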
2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 1.947ms self 1.947ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 3.590ms self 3.590ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledFunction total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 2.831ms self 0.005ms children 2.826ms %children 99.83% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 2.820ms self 0.019ms children 2.801ms %children 99.31% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 2.801ms self 0.014ms children 2.786ms %children 99.49% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.135 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 2.659ms self 2.659ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR total 0.197ms self 0.004ms children 0.194ms %children 98.12% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.190ms self 0.007ms children 0.183ms %children 96.15% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.183ms self 0.010ms children 0.173ms %children 94.74% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 
0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable total 2.528ms self 2.479ms children 0.049ms %children 1.93% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.145ms self 0.010ms children 0.135ms %children 93.13% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.131ms self 0.008ms children 0.123ms %children 94.25% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.123ms self 0.010ms children 0.113ms %children 91.60% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.029ms self 0.003ms children 0.026ms %children 88.02% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets total 0.018ms self 0.002ms children 0.015ms %children 86.10% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.017ms self 0.002ms children 0.015ms %children 87.19% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.119ms self 0.003ms children 0.116ms %children 97.81% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.113ms self 0.007ms children 0.106ms %children 93.94% 2023-04-22 21:13:43.136 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.106ms self 0.009ms children 0.097ms %children 91.25% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 0.026ms self 0.003ms children 0.023ms %children 88.02% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.108ms self 0.003ms children 0.106ms %children 97.56% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.103ms self 0.007ms children 0.097ms %children 93.64% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.097ms self 0.009ms children 0.088ms %children 91.19% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : 
INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile total 97.959ms self 93.366ms children 4.593ms %children 4.69% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.465ms self 0.005ms children 0.460ms %children 98.86% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.447ms self 0.014ms children 0.433ms %children 96.85% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.433ms self 0.015ms children 0.418ms %children 96.46% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.097ms self 0.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.119ms self 0.119ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.023ms self 0.005ms children 0.019ms %children 80.34% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.316ms self 0.004ms children 0.312ms %children 98.59% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.300ms self 0.012ms children 0.288ms %children 96.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.288ms self 0.013ms children 0.275ms %children 95.46% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.137 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.277ms self 0.005ms children 0.271ms %children 98.09% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.260ms self 0.260ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.302ms self 0.005ms children 0.296ms %children 98.22% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.286ms self 0.012ms children 0.274ms %children 95.67% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.274ms self 0.013ms children 0.261ms %children 95.35% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, 
after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.241ms self 0.241ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.421ms self 0.006ms children 0.415ms %children 98.68% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.389ms self 0.015ms children 0.373ms %children 96.05% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.373ms self 0.030ms children 0.343ms %children 91.84% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.106ms self 0.106ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.043ms self 0.005ms children 0.038ms %children 88.46% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.302ms self 0.005ms children 0.298ms %children 98.50% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.288ms self 0.012ms children 0.276ms %children 95.68% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.276ms self 0.013ms children 0.262ms %children 95.15% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.252ms self 0.006ms children 0.246ms %children 97.74% 2023-04-22 21:13:43.138 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.234ms self 0.234ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.297ms self 0.005ms children 0.292ms %children 98.24% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.282ms self 0.012ms children 0.270ms %children 95.58% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.270ms self 0.013ms children 0.257ms %children 95.17% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.040ms self 
0.040ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.242ms self 0.242ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.332ms self 0.005ms children 0.327ms %children 98.42% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.316ms self 0.013ms children 0.303ms %children 95.80% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.303ms self 0.014ms children 0.288ms %children 95.24% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 
2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.022ms self 0.004ms children 0.018ms %children 80.37% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.303ms self 0.005ms children 0.298ms %children 98.49% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.289ms self 0.012ms children 0.276ms %children 95.75% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.276ms self 0.015ms children 0.261ms %children 94.54% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.040ms self 0.040ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.232ms self 0.005ms children 0.227ms %children 97.84% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.305ms self 0.005ms children 0.300ms %children 98.32% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.291ms self 0.012ms children 0.279ms %children 95.81% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.279ms self 0.013ms children 0.266ms %children 95.34% 2023-04-22 21:13:43.139 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.218ms self 0.218ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/InitializeCompiledFunction total 3.457ms self 3.457ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/RunCompiledFunction total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.035ms self 0.005ms children 0.030ms %children 86.26% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.778ms self 0.005ms children 0.773ms %children 99.32% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.763ms self 0.010ms children 0.753ms %children 98.73% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.753ms self 0.022ms children 0.731ms %children 97.08% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.125ms self 0.125ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.250ms self 0.250ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 282.852ms self 0.009ms children 282.843ms %children 100.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 282.827ms self 59.517ms children 223.310ms %children 78.96% 2023-04-22 21:13:43.140 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 1.598ms self 0.006ms children 1.593ms %children 99.65% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 1.574ms self 0.012ms children 1.562ms %children 99.26% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 1.562ms self 0.028ms children 1.535ms %children 98.22% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.199ms self 0.199ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.285ms self 0.285ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.089ms self 0.089ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.160ms self 0.160ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.165ms self 0.165ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 0.054ms self 0.005ms children 0.049ms %children 90.47% 2023-04-22 21:13:43.140 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.663ms self 0.007ms children 0.656ms %children 98.97% 2023-04-22 21:13:43.141 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.641ms self 0.011ms children 0.631ms %children 98.35% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.631ms self 0.015ms children 0.616ms %children 97.62% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.167ms self 0.167ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.168ms self 0.168ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.050ms self 0.005ms children 0.045ms %children 89.30% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.035ms self 0.005ms children 0.030ms %children 85.67% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 0.036ms self 0.005ms children 0.031ms %children 86.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.670ms self 0.006ms children 0.664ms %children 99.13% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify 
total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.650ms self 0.010ms children 0.640ms %children 98.48% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.640ms self 0.014ms children 0.626ms %children 97.78% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.098ms self 0.098ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.183ms self 0.183ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.176ms self 0.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 0.053ms self 0.005ms children 0.048ms %children 90.55% 2023-04-22 21:13:43.141 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.660ms self 0.006ms children 0.654ms %children 99.14% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.640ms self 0.011ms children 0.629ms %children 98.26% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.629ms self 0.015ms children 0.613ms %children 97.57% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.099ms self 0.099ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.141 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.167ms self 0.167ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 166.791ms self 156.146ms children 10.645ms %children 6.38% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.867ms self 0.006ms children 0.862ms %children 99.33% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.841ms self 0.015ms children 0.825ms %children 98.19% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.825ms self 0.018ms children 0.808ms %children 97.85% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.092ms self 0.092ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.126ms self 0.126ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.206ms self 0.206ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.253ms self 0.253ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.034ms self 0.005ms children 0.029ms %children 85.20% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.794ms self 0.006ms children 0.789ms %children 99.26% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.770ms self 0.015ms children 0.755ms %children 98.08% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize total 0.755ms self 0.017ms children 0.738ms %children 97.75% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.118ms self 0.118ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.201ms self 0.201ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.219ms self 0.219ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.435ms self 0.006ms children 0.429ms %children 98.59% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.408ms self 0.408ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.811ms self 0.006ms children 0.805ms %children 99.23% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.786ms self 0.015ms children 0.771ms %children 98.06% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.771ms self 0.017ms children 0.754ms %children 97.80% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.226ms self 0.226ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.388ms self 0.388ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.142 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.870ms self 0.007ms children 0.863ms %children 99.21% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.843ms self 0.016ms children 0.827ms %children 98.14% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.827ms self 0.018ms children 0.809ms %children 97.78% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : 
INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.217ms self 0.217ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.035ms self 0.005ms children 0.030ms %children 84.40% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 1.635ms self 0.006ms children 1.628ms %children 99.62% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 1.609ms self 0.015ms children 1.593ms %children 99.04% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 1.593ms self 0.021ms children 1.573ms %children 98.71% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.845ms self 0.845ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.206ms self 0.206ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.211ms self 0.211ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.410ms self 0.007ms children 0.403ms %children 98.41% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.384ms self 0.384ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.778ms self 0.006ms children 0.772ms %children 99.23% 2023-04-22 21:13:43.143 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.753ms self 0.016ms children 0.737ms %children 97.88% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.737ms self 0.017ms children 0.720ms %children 97.67% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.192ms self 0.192ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.210ms self 0.210ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.390ms self 0.390ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.830ms self 0.006ms children 0.824ms %children 99.30% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.803ms self 0.017ms children 0.787ms %children 97.91% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.787ms self 0.018ms children 0.768ms %children 97.66% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.143 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.092ms self 0.092ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.197ms self 0.197ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.214ms self 0.214ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.035ms self 0.005ms children 0.030ms %children 85.93% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.789ms self 0.006ms children 0.783ms %children 99.22% 2023-04-22 21:13:43.144 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.764ms self 0.016ms children 0.747ms %children 97.85% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.747ms self 0.019ms children 0.729ms %children 97.52% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.119ms self 0.119ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.193ms self 0.193ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.222ms self 0.222ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.403ms self 0.006ms children 0.396ms %children 98.48% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.761ms self 0.006ms children 0.755ms %children 99.22% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.736ms self 0.016ms children 0.721ms %children 97.85% 2023-04-22 21:13:43.166 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.721ms self 0.017ms children 0.703ms %children 97.60% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 0.381ms self 0.381ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 2.130ms self 2.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledFunction total 50.570ms self 50.570ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.184ms self 0.005ms children 0.179ms %children 97.21% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.175ms self 0.010ms children 0.165ms %children 94.35% 2023-04-22 21:13:43.166 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.165ms self 0.012ms children 0.153ms %children 92.66% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify 
total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR total 0.142ms self 0.004ms children 0.138ms %children 97.16% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.134ms self 0.008ms children 0.127ms %children 94.25% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.127ms self 0.009ms children 0.118ms %children 93.01% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.025ms self 0.025ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable total 0.034ms self 0.004ms children 0.031ms %children 89.64% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.026ms self 0.026ms children 0.000ms %children 0.00% 
2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerMatrixToTable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 0.115ms self 0.003ms children 0.112ms %children 96.98% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.109ms self 0.006ms children 0.103ms %children 94.16% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.103ms self 0.009ms children 0.094ms %children 91.64% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.027ms self 0.003ms children 0.024ms %children 88.58% 2023-04-22 21:13:43.167 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets total 0.016ms self 0.002ms children 0.013ms %children 84.38% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/EvalRelationalLets/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.015ms self 0.002ms children 0.013ms %children 85.35% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.167 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.112ms self 0.003ms children 0.109ms %children 97.50% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.107ms self 0.006ms children 0.100ms %children 94.01% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.100ms self 0.008ms children 0.092ms %children 91.61% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 0.024ms self 0.003ms children 0.021ms %children 87.11% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.104ms self 0.003ms children 0.101ms %children 97.34% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.099ms self 0.006ms children 0.093ms %children 93.67% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.093ms self 0.008ms children 0.085ms %children 91.39% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile total 95.279ms self 90.824ms children 4.455ms %children 4.68% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.335ms self 0.005ms children 0.330ms %children 98.44% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.319ms self 0.013ms children 0.306ms %children 95.80% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.306ms self 0.014ms children 0.291ms %children 95.33% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.079ms self 0.079ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.024ms self 0.004ms children 0.020ms %children 81.58% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.285ms self 0.005ms children 0.281ms %children 98.42% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.168 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.271ms self 0.012ms children 0.259ms %children 95.57% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: 
compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.259ms self 0.013ms children 0.246ms %children 94.86% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.283ms self 0.005ms children 0.277ms %children 98.08% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.266ms self 0.266ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.299ms self 0.005ms children 0.294ms %children 98.26% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 
0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.284ms self 0.012ms children 0.272ms %children 95.75% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.272ms self 0.013ms children 0.259ms %children 95.30% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.247ms self 0.247ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.338ms self 0.005ms children 0.332ms %children 98.40% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.321ms self 
0.013ms children 0.308ms %children 96.07% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.308ms self 0.014ms children 0.294ms %children 95.30% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.023ms self 0.005ms children 0.019ms %children 80.53% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.355ms self 0.004ms children 0.351ms %children 98.79% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.342ms self 0.012ms children 0.330ms %children 96.54% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.330ms self 0.014ms children 0.316ms %children 95.82% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.169 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.124ms self 0.124ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.260ms self 0.005ms children 0.255ms %children 97.99% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.243ms self 0.243ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.295ms self 0.005ms children 0.290ms %children 98.28% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.281ms self 0.013ms children 0.268ms %children 95.41% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.268ms self 0.014ms children 0.254ms %children 94.92% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.233ms self 0.233ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 0.361ms self 0.006ms children 0.355ms %children 
98.44% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.343ms self 0.013ms children 0.330ms %children 96.20% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.330ms self 0.014ms children 0.317ms %children 95.80% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.077ms self 0.077ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR total 0.024ms self 0.006ms children 0.018ms %children 73.27% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 
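
As a quick sanity check on how these fields relate (using the "compileLowerer, initial IR" entry a few lines above): self 0.006ms plus children 0.355ms gives the printed total of 0.361ms, and children/total = 0.355/0.361 ≈ 98.3%, which agrees with the printed %children of 98.44% once rounding of the individual millisecond values is allowed for. This is consistent with reading self as the time spent in a stage itself, children as the time attributed to its nested sub-stages, and %children as their share of the total.
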
2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 0.287ms self 0.005ms children 0.282ms %children 98.37% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.272ms self 0.012ms children 0.260ms %children 95.58% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.260ms self 0.016ms children 0.245ms %children 93.98% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 0.274ms self 0.005ms children 0.269ms %children 98.13% 2023-04-22 21:13:43.170 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.258ms self 0.258ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.300ms self 0.005ms children 0.294ms %children 98.25% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.284ms self 0.013ms children 0.272ms %children 95.46% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.272ms self 0.021ms children 0.250ms %children 92.20% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/Compile/EmitContext.analyze total 0.232ms self 0.232ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/InitializeCompiledFunction total 21.791ms self 21.791ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/RunCompiledFunction total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles total 0.187ms self 0.005ms children 0.181ms %children 97.09% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.086ms self 0.086ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 45.511ms self 0.006ms children 45.505ms %children 99.99% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 45.401ms self 0.014ms children 45.387ms %children 99.97% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 45.387ms self 0.053ms children 45.334ms %children 99.88% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.254ms self 0.254ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.498ms self 0.498ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.625ms self 0.625ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.840ms self 0.840ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 1.880ms self 1.880ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.183ms self 0.183ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 28.407ms self 28.407ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.270ms self 0.270ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.269ms self 0.269ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.515ms self 0.515ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.794ms self 0.794ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 1.556ms self 1.556ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.180ms self 0.180ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 1.944ms self 1.944ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.262ms self 0.262ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.270ms self 0.270ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.494ms self 0.494ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.755ms self 0.755ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 3.190ms self 3.190ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 1.953ms self 1.953ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable total 2.469s self 0.011ms children 2.469s %children 100.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 2.469s self 107.859ms children 2.361s %children 95.63% 2023-04-22 21:13:43.171 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 120.179ms self 0.008ms children 120.171ms %children 99.99% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 119.951ms self 0.027ms children 119.924ms %children 99.98% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 119.924ms self 0.059ms children 119.865ms %children 99.95% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.868ms self 0.868ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: 
relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.907ms self 0.907ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 19.516ms self 19.516ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 5.361ms self 5.361ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 6.344ms self 6.344ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.979ms self 0.979ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 45.418ms self 45.418ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 1.111ms self 1.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.527ms self 0.527ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.127ms self 1.127ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.589ms self 1.589ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 5.985ms self 5.985ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.934ms self 0.934ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.926ms self 3.926ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.748ms self 0.748ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.488ms self 0.488ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.032ms self 1.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.453ms self 1.453ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 15.775ms self 15.775ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 1.091ms self 1.091ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 4.687ms self 4.687ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.157ms self 0.157ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 9.429ms self 0.010ms children 9.418ms %children 99.89% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 9.295ms self 9.295ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 104.996ms self 0.008ms children 104.988ms %children 99.99% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 104.775ms self 0.029ms children 104.745ms %children 99.97% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 104.745ms self 0.053ms children 104.692ms %children 99.95% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.468ms self 0.468ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.476ms self 0.476ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.035ms self 1.035ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.451ms self 1.451ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 32.186ms self 32.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 2.826ms self 2.826ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.692ms self 3.692ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.662ms self 0.662ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.486ms self 0.486ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.019ms self 1.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.431ms self 1.431ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 18.196ms self 18.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 3.266ms self 3.266ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.721ms self 3.721ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.638ms self 0.638ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.482ms self 0.482ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.013ms self 1.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.172 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, 
after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.433ms self 1.433ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 25.871ms self 25.871ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.673ms self 0.673ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.664ms self 3.664ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 1.872ms self 0.007ms children 1.865ms %children 99.63% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 1.739ms self 1.739ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.370ms self 0.005ms children 0.365ms %children 98.70% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.234ms self 0.234ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 0.475ms self 0.005ms children 0.470ms %children 99.02% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 0.306ms self 0.306ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.093ms self 0.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 64.578ms self 0.007ms children 64.571ms %children 99.99% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 64.368ms self 0.037ms children 64.331ms %children 99.94% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 64.331ms self 0.052ms children 64.279ms %children 99.92% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.386ms self 0.386ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.471ms self 0.471ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.110ms self 1.110ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.468ms self 1.468ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 10.497ms self 10.497ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.665ms self 0.665ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 3.549ms self 3.549ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.616ms self 0.616ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.481ms self 0.481ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.009ms self 1.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.401ms self 1.401ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 13.011ms self 13.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 13.643ms self 13.643ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 3.599ms self 3.599ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.675ms self 0.675ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.482ms self 0.482ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.033ms self 1.033ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.350ms self 1.350ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 4.760ms self 4.760ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.628ms self 0.628ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 3.446ms self 3.446ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 14.268ms self 0.007ms children 14.261ms %children 99.95% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation total 14.097ms self 14.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable 
total 87.490ms self 0.008ms children 87.482ms %children 99.99% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.173 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 87.279ms self 0.025ms children 87.253ms %children 99.97% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 87.253ms self 0.049ms children 87.204ms %children 99.94% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.371ms self 0.371ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.487ms self 0.487ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 1.010ms self 1.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 1.345ms self 1.345ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 22.840ms self 22.840ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.637ms self 0.637ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 3.450ms self 3.450ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.639ms self 0.639ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.474ms self 0.474ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 1.007ms self 1.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 25.357ms self 25.357ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 3.690ms self 3.690ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.611ms self 0.611ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 3.416ms self 3.416ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.598ms self 0.598ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.473ms self 0.473ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 6.606ms self 6.606ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 1.353ms self 1.353ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 8.754ms self 8.754ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.632ms self 0.632ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 3.455ms self 3.455ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.156ms self 0.156ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 1.693s self 898.965ms children 793.572ms %children 46.89% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 89.933ms self 0.008ms children 89.925ms %children 99.99% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 89.716ms self 0.031ms children 89.684ms %children 99.97% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 89.684ms self 0.058ms children 89.626ms %children 99.94% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.496ms self 0.496ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.495ms self 0.495ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.046ms self 1.046ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.309ms self 1.309ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 19.729ms self 19.729ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.497ms self 0.497ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.481ms self 3.481ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.614ms self 0.614ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.472ms self 0.472ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.009ms self 1.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 20.336ms self 20.336ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.365ms self 3.365ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.468ms self 0.468ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.415ms self 3.415ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 4.850ms self 4.850ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.489ms self 0.489ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.041ms self 1.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.317ms self 1.317ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.241ms self 3.241ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.467ms self 0.467ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 21.488ms self 21.488ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.174 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.159ms self 0.159ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.656ms self 0.006ms children 0.651ms %children 99.13% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.102ms self 0.102ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.378ms self 0.378ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.170ms self 0.170ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 109.369ms self 0.007ms children 109.362ms %children 99.99% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 109.099ms self 0.029ms children 109.070ms %children 99.97% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 109.070ms self 0.058ms children 109.012ms %children 99.95% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.502ms self 0.502ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.565ms self 0.565ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 13.547ms self 13.547ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.175 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.001ms self 2.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.582ms self 4.582ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.720ms self 0.720ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 48.561ms self 48.561ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.672ms self 0.672ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.589ms self 0.589ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.619ms self 1.619ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.278ms self 1.278ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.634ms self 4.634ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.684ms self 0.684ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 3.392ms self 3.392ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.666ms self 0.666ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.585ms self 0.585ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.281ms self 1.281ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.471ms self 1.471ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.624ms self 4.624ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.682ms self 0.682ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 16.357ms self 16.357ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 19.376ms self 0.007ms children 19.369ms %children 99.96% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 19.076ms self 19.076ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.140ms self 0.140ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 113.202ms self 0.007ms children 113.195ms %children 99.99% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 112.955ms self 0.030ms children 112.925ms %children 99.97% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 112.925ms self 0.057ms children 112.867ms %children 99.95% 2023-04-22 21:13:43.184 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.571ms self 0.571ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 27.399ms self 27.399ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.295ms self 1.295ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.053ms self 4.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 18.978ms self 18.978ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.431ms self 3.431ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.184 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.649ms self 0.649ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.569ms self 0.569ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.296ms self 1.296ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.293ms self 1.293ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 10.204ms self 10.204ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.559ms self 0.559ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.332ms self 3.332ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.631ms self 0.631ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.575ms self 0.575ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 14.383ms self 14.383ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.290ms self 1.290ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 18.026ms self 18.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.564ms self 0.564ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.391ms self 3.391ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.182ms self 0.182ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 5.074ms self 5.074ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 72.344ms self 0.015ms children 72.330ms %children 99.98% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 72.166ms self 0.037ms children 72.129ms %children 99.95% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 72.129ms self 0.085ms children 72.044ms %children 99.88% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.421ms self 0.421ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.552ms self 0.552ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.686ms self 1.686ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.977ms self 1.977ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.021ms self 3.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.205ms self 0.205ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: 
timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.344ms self 3.344ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.507ms self 0.507ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 23.522ms self 23.522ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.978ms self 0.978ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.247ms self 1.247ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 2.930ms self 2.930ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.204ms self 0.204ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.135ms self 3.135ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 3.189ms self 3.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.499ms self 0.499ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.019ms self 1.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.252ms self 1.252ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 2.844ms self 2.844ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 19.316ms self 19.316ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.097ms self 0.097ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.598ms self 0.007ms children 0.590ms %children 98.76% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.109ms self 0.109ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.333ms self 0.333ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 73.905ms self 0.008ms children 73.898ms %children 99.99% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 73.718ms self 0.029ms children 73.689ms %children 99.96% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, 
after InlineApplyIR/LoweringTransformation/Optimize total 73.689ms self 0.057ms children 73.632ms %children 99.92% 2023-04-22 21:13:43.185 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 3.178ms self 3.178ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.200ms self 1.200ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.818ms self 1.818ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 3.586ms self 3.586ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.377ms self 0.377ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 13.308ms self 13.308ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.546ms self 0.546ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.568ms self 0.568ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.185ms self 1.185ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.183ms self 1.183ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 17.503ms self 17.503ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.413ms self 0.413ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 3.162ms self 3.162ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.569ms self 0.569ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.565ms self 0.565ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.170ms self 1.170ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.191ms self 1.191ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 3.368ms self 3.368ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 15.189ms self 15.189ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 3.176ms self 3.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/Verify total 0.118ms self 0.118ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 19.463ms self 0.007ms children 19.456ms %children 99.96% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.153ms self 0.153ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 19.166ms self 19.166ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 49.210ms self 0.007ms children 49.203ms %children 99.98% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 49.026ms self 0.046ms children 48.980ms %children 99.91% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 48.980ms self 0.054ms children 48.926ms %children 99.89% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.362ms self 0.362ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.560ms self 0.560ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.131ms self 1.131ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.193ms self 1.193ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 11.467ms self 11.467ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.382ms self 0.382ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.151ms self 3.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.517ms self 0.517ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.553ms self 0.553ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.103ms self 1.103ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.199ms self 1.199ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.217ms self 4.217ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.368ms self 0.368ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 2.995ms self 2.995ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.506ms self 0.506ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.561ms self 0.561ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 10.596ms self 10.596ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.202ms self 1.202ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.441ms self 3.441ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.405ms self 0.405ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.019ms self 3.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.186 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.110ms self 0.110ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 20.091ms self 20.091ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 54.219ms self 0.013ms children 54.206ms %children 99.98% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 54.055ms self 0.031ms children 54.024ms %children 99.94% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 54.024ms self 0.057ms children 53.967ms %children 99.89% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.339ms self 0.339ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.522ms self 0.522ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.889ms self 0.889ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.278ms self 1.278ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 18.844ms self 18.844ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.193ms self 0.193ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.285ms self 3.285ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.464ms self 0.464ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.467ms self 0.467ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.816ms self 0.816ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.186ms self 1.186ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 10.549ms self 10.549ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.198ms self 0.198ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.069ms self 3.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.459ms self 0.459ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.471ms self 0.471ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.808ms self 0.808ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 3.860ms self 3.860ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.019ms self 3.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.220ms self 0.220ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 3.029ms self 3.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.090ms self 0.090ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.576ms self 0.007ms children 0.569ms %children 98.82% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.106ms self 0.106ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.311ms self 0.311ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 64.632ms self 0.007ms children 64.625ms %children 99.99% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 64.449ms self 0.027ms children 64.422ms %children 99.96% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 64.422ms self 0.055ms children 64.367ms %children 99.91% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.383ms self 0.383ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.555ms self 0.555ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.028ms self 1.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.887ms self 2.887ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 3.065ms self 3.065ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.359ms self 0.359ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 4.512ms self 4.512ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.518ms self 0.518ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.553ms self 0.553ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.029ms self 1.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.183ms self 1.183ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.187 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 13.598ms self 13.598ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.375ms self 0.375ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 18.510ms self 18.510ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 1.735ms self 1.735ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.564ms self 0.564ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 3.532ms self 3.532ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.240ms self 1.240ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.041ms self 4.041ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.373ms self 0.373ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 4.327ms self 4.327ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.111ms self 0.111ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 8.229ms self 0.007ms children 8.222ms %children 99.91% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 7.934ms self 7.934ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 77.993ms self 0.007ms children 77.986ms %children 99.99% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.057ms self 0.057ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 77.799ms self 0.025ms children 77.774ms %children 99.97% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 77.774ms self 0.053ms children 77.721ms %children 99.93% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.334ms self 0.334ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 27.576ms self 27.576ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.049ms self 1.049ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.173ms self 1.173ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 2.858ms self 2.858ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: 
compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.341ms self 0.341ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 5.588ms self 5.588ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.487ms self 0.487ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.556ms self 0.556ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.015ms self 1.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.176ms self 1.176ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 2.838ms self 2.838ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.347ms self 0.347ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 19.093ms self 19.093ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.507ms self 0.507ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.561ms self 0.561ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing 
SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.014ms self 1.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 3.239ms self 3.239ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.708ms self 4.708ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.227ms self 0.227ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 3.032ms self 3.032ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 14.701ms self 14.701ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 20.923ms self 20.923ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledFunction total 244.162ms self 244.162ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.266ms self 0.009ms children 0.258ms %children 96.78% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.252ms self 
0.023ms children 0.229ms %children 90.91% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.229ms self 0.013ms children 0.215ms %children 94.16% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.188 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.030ms self 0.005ms children 0.025ms %children 83.20% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.303ms self 0.005ms children 0.298ms %children 98.42% 2023-04-22 21:13:43.189 : INFO: timing 
SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.293ms self 0.010ms children 0.284ms %children 96.69% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.284ms self 0.018ms children 0.266ms %children 93.66% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.087ms self 0.087ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 
0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 0.075ms self 0.004ms children 0.070ms %children 94.01% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 0.053ms self 0.053ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.114ms self 0.003ms children 0.111ms %children 97.22% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.108ms self 0.007ms children 0.101ms %children 93.62% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.101ms self 0.008ms children 0.093ms %children 91.74% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/Compile total 24.585ms self 22.834ms children 1.751ms %children 7.12% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.203ms self 0.005ms children 0.198ms %children 97.77% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.193ms self 0.007ms children 0.186ms %children 96.40% 2023-04-22 21:13:43.189 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.186ms self 0.013ms children 0.173ms %children 92.81% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/Simplify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.017ms self 0.004ms children 0.014ms %children 77.88% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.085ms self 0.003ms children 0.083ms %children 96.87% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.081ms self 0.006ms children 0.074ms %children 92.26% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.074ms self 0.007ms children 0.067ms %children 90.05% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing 
SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.155ms self 0.003ms children 0.152ms %children 97.99% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.142ms self 0.142ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.102ms self 0.003ms children 0.099ms %children 96.70% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.096ms self 0.007ms children 0.090ms %children 93.22% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.090ms self 0.008ms children 0.082ms %children 90.95% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.084ms self 0.084ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.223ms self 0.006ms children 0.217ms %children 97.52% 2023-04-22 21:13:43.190 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.212ms self 0.008ms children 0.204ms %children 96.34% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.204ms self 0.014ms children 0.190ms %children 93.14% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: 
timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.015ms self 0.004ms children 0.011ms %children 75.42% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.088ms self 0.003ms children 0.085ms %children 96.74% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.083ms self 0.006ms children 0.077ms %children 92.47% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.077ms self 0.008ms children 0.069ms %children 89.88% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.080ms self 0.003ms children 0.077ms %children 95.87% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.094ms self 0.003ms children 0.091ms %children 96.85% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.089ms self 0.006ms children 0.083ms %children 92.95% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.083ms self 0.008ms children 0.075ms %children 90.46% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms 
children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.068ms self 0.068ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.191 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR total 0.208ms self 0.005ms children 0.203ms %children 97.62% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.198ms self 0.008ms children 0.191ms %children 96.21% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.191ms self 0.014ms children 0.177ms %children 92.80% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms 
%children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR total 0.015ms self 0.004ms children 0.011ms %children 74.51% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/LoweringTransformation total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR total 0.088ms self 0.003ms children 0.085ms %children 96.79% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.083ms self 0.006ms children 0.076ms %children 92.34% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.076ms self 0.007ms children 0.069ms %children 90.47% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets 
total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs total 0.074ms self 0.003ms children 0.071ms %children 95.89% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.093ms self 0.003ms children 0.090ms %children 96.65% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.088ms self 0.006ms children 0.082ms %children 92.96% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.082ms self 0.008ms children 0.074ms %children 90.51% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.192 : INFO: timing SparkBackend.executeEncode/Compile/EmitContext.analyze total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.193 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.302ms self 1.302ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.193 : INFO: timing SparkBackend.executeEncode/RunCompiledFunction total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.250 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.251 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.251 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:43.251 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.251 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.251 : INFO: timing SparkBackend.parse_value_ir total 0.733ms self 0.733ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.251 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.252 : INFO: starting execution of query hail_query_6 of initial size 5 2023-04-22 21:13:43.252 : INFO: initial IR: IR size 5: (Let __rng_state (RNGStateLiteral) (MakeTuple (0) (MakeStruct (__gt (NA Int32))))) 2023-04-22 21:13:43.253 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after LowerMatrixToTable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after EvalRelationalLets: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after LowerAndExecuteShuffles: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.253 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Literal Tuple[Struct{__gt:Int32}] ) 2023-04-22 21:13:43.255 : INFO: encoder cache hit 2023-04-22 21:13:43.255 : INFO: finished execution of query hail_query_6, result size is 2.00 B 2023-04-22 21:13:43.255 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.255 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks 
requested=0, cache hits=0 2023-04-22 21:13:43.256 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.256 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode total 4.193ms self 1.564ms children 2.629ms %children 62.71% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 0.593ms self 0.007ms children 0.587ms %children 98.87% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 0.578ms self 0.012ms children 0.566ms %children 97.86% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 0.566ms self 0.017ms children 0.549ms %children 97.04% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.147ms self 0.147ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.195ms self 0.195ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 
21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 0.030ms self 0.005ms children 0.025ms %children 84.31% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 0.115ms self 0.003ms children 0.112ms %children 97.10% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.110ms self 0.007ms children 0.102ms %children 93.26% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 0.102ms self 0.008ms children 0.094ms %children 92.09% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.015ms self 0.015ms children 
0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 0.055ms self 0.003ms children 0.052ms %children 94.06% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.047ms self 0.047ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 0.016ms self 0.003ms children 0.013ms %children 83.44% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.256 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 0.017ms self 0.004ms children 0.013ms %children 77.46% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.110ms self 0.008ms children 0.101ms %children 92.34% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.098ms self 0.008ms children 0.090ms %children 91.67% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.090ms self 0.009ms children 0.081ms %children 90.14% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 0.020ms self 0.003ms children 0.017ms %children 84.98% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.091ms self 0.003ms children 0.088ms %children 97.02% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.086ms self 0.006ms children 0.080ms %children 93.22% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.080ms self 0.008ms children 0.073ms %children 90.45% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: 
relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/Compile total 0.236ms self 0.236ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.345ms self 1.345ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.257 : INFO: timing SparkBackend.executeEncode/RunCompiledFunction total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.334 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.355 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.356 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:13:43.356 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.356 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:13:43.356 : INFO: timing SparkBackend.parse_value_ir total 21.790ms self 21.790ms children 0.000ms %children 0.00% 2023-04-22 21:13:43.356 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:13:43.357 : INFO: starting execution of query hail_query_7 of initial size 49 2023-04-22 21:13:43.359 : INFO: initial IR: IR size 49: (Let __rng_state (RNGStateLiteral) (MatrixWrite "{\"name\":\"MatrixBlockMatrixWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096}" (MatrixMapEntries (MatrixMapRows (MatrixMapEntries (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (SelectFields () (SelectFields (GT) (Ref g))) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref g)))))) (If (ApplyUnaryPrimOp Bang (IsNA (SelectFields (__gt) (Ref g)))) (SelectFields (__gt) (Ref g)) (Literal Struct{__gt:Int32} ))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref va)) None (__mean_gt (AggLet __uid_agg5 False (ApplyIR 14 toFloat64 () Float64 (GetField __gt (Ref g))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __uid_agg5))) (ApplyIR 16 toFloat64 () Float64 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __uid_agg5)))))))))))) (InsertFields (SelectFields () (SelectFields (__gt) (Ref g))) None (__uid_6 (Coalesce (ApplyIR 17 toFloat64 () Float64 (GetField __gt (Ref g))) (GetField __mean_gt (Ref va)))))))) 2023-04-22 21:13:43.405 : INFO: after optimize: relationalLowerer, initial IR: IR size 43: (MatrixWrite "{\"name\":\"MatrixBlockMatrixWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096}" (MatrixMapEntries (MatrixMapRows (MatrixMapEntries (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2585 (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref g))))) (If (IsNA (Ref __iruid_2585)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2585)))) (InsertFields (Ref va) None (__mean_gt (AggLet __iruid_2586 False (Cast Float64 (GetField __gt (Ref g))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2586))) (Let __iruid_2587 (ApplyAggOp Sum () ((Apply 15 
toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2586)))))) (Cast Float64 (Ref __iruid_2587)))))))) (InsertFields (SelectFields () (Ref g)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref g))) (GetField __mean_gt (Ref va))))))) 2023-04-22 21:13:43.444 : INFO: after LowerMatrixToTable: IR size 102: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableMapRows (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (InsertFields (Ref row) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamZip -1 AssumeSameLength (g sa) (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (ToStream False (GetField __cols (Ref global))) (Let __iruid_2585 (InsertFields (SelectFields () (Ref g)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref g))))) (If (IsNA (Ref __iruid_2585)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2585)))))))) (Let n_cols (ArrayLen (GetField __cols (Ref global))) (InsertFields (Let __iruid_2588 (MakeStruct) (StreamAgg i (ToStream False (ToArray (StreamFilter i (ToStream False (ToArray (StreamRange -1 False (I32 0) (ArrayLen (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (I32 1)))) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i))))))) (AggLet sa False (ArrayRef -1 (GetField __cols (Ref global)) (Ref i)) (AggLet g False (ArrayRef -1 (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)) (Ref i)) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2586 False (Cast Float64 (GetField __gt (Ref g))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2586))) (Let __iruid_2587 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2586)))))) (Cast Float64 (Ref __iruid_2587))))))))))) None (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row)))))) (InsertFields (Ref row) None (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamZip -1 AssumeSameLength (g sa) (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (ToStream False (GetField __cols (Ref global))) (InsertFields (SelectFields () (Ref g)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref g))) (GetField __mean_gt (SelectFields (locus alleles rsid cm_position __mean_gt) (Ref row)))))))))))) 2023-04-22 21:13:43.490 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 69: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2632 (ToArray (StreamMap __iruid_2633 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2634 (InsertFields (SelectFields () (Ref __iruid_2633)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2633))))) (If (IsNA (Ref __iruid_2634)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2634))))) (Let __iruid_2635 (StreamAgg __iruid_2636 (StreamFilter __iruid_2637 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2632)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2637))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2638 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2636)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2638))) (Let __iruid_2639 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2638)))))) (Cast Float64 (Ref __iruid_2639)))))))) (InsertFields (Ref __iruid_2635) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2640 (ToStream False (Ref __iruid_2632)) (InsertFields (SelectFields () (Ref __iruid_2640)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2640))) (GetField __mean_gt (Ref __iruid_2635))))))))))))) 2023-04-22 21:13:43.493 : INFO: after LiftRelationalValuesToRelationalLets: IR size 69: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2632 (ToArray (StreamMap __iruid_2633 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2634 (InsertFields (SelectFields () (Ref __iruid_2633)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2633))))) (If (IsNA (Ref __iruid_2634)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2634))))) (Let __iruid_2635 (StreamAgg __iruid_2636 (StreamFilter __iruid_2637 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2632)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2637))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2638 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2636)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2638))) (Let __iruid_2639 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2638)))))) (Cast Float64 (Ref __iruid_2639)))))))) (InsertFields (Ref __iruid_2635) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2640 (ToStream False (Ref __iruid_2632)) (InsertFields (SelectFields () (Ref __iruid_2640)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2640))) (GetField __mean_gt (Ref __iruid_2635))))))))))))) 2023-04-22 21:13:43.496 : INFO: after EvalRelationalLets: IR size 69: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2632 (ToArray (StreamMap __iruid_2633 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2634 (InsertFields (SelectFields () (Ref __iruid_2633)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2633))))) (If (IsNA (Ref __iruid_2634)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2634))))) (Let __iruid_2635 (StreamAgg __iruid_2636 (StreamFilter __iruid_2637 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2632)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2637))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2638 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2636)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2638))) (Let __iruid_2639 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2638)))))) (Cast Float64 (Ref __iruid_2639)))))))) (InsertFields (Ref __iruid_2635) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2640 (ToStream False (Ref __iruid_2632)) (InsertFields (SelectFields () (Ref __iruid_2640)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2640))) (GetField __mean_gt (Ref __iruid_2635))))))))))))) 2023-04-22 21:13:43.510 : INFO: after LowerAndExecuteShuffles: IR size 69: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2632 (ToArray (StreamMap __iruid_2633 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2634 (InsertFields (SelectFields () (Ref __iruid_2633)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2633))))) (If (IsNA (Ref __iruid_2634)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2634))))) (Let __iruid_2635 (StreamAgg __iruid_2636 (StreamFilter __iruid_2637 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2632)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2637))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2638 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2632) (Ref __iruid_2636)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2638))) (Let __iruid_2639 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2638)))))) (Cast Float64 (Ref __iruid_2639)))))))) (InsertFields (Ref __iruid_2635) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2640 (ToStream False (Ref __iruid_2632)) (InsertFields (SelectFields () (Ref __iruid_2640)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2640))) (GetField __mean_gt (Ref __iruid_2635))))))))))))) 2023-04-22 21:13:43.543 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 69: (TableWrite "{\"writer\":{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false,\"entryField\":\"__uid_6\",\"blockSize\":4096},\"colsFieldName\":\"__cols\",\"entriesFieldName\":\"the entries! [877f12a8827e18f61222c6c8c5fb04a8]\",\"colKey\":[\"s\"]}" (TableMapRows (TableRead Table{global:Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __iruid_2659 (ToArray (StreamMap __iruid_2660 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2661 (InsertFields (SelectFields () (Ref __iruid_2660)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2660))))) (If (IsNA (Ref __iruid_2661)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2661))))) (Let __iruid_2662 (StreamAgg __iruid_2663 (StreamFilter __iruid_2664 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2659)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2664))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2665 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2663)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2665))) (Let __iruid_2666 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2665)))))) (Cast Float64 (Ref __iruid_2666)))))))) (InsertFields (Ref __iruid_2662) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2667 (ToStream False (Ref __iruid_2659)) (InsertFields (SelectFields () (Ref __iruid_2667)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2667))) (GetField __mean_gt (Ref __iruid_2662))))))))))))) 2023-04-22 21:13:43.559 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
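Annotation: the IR dumps above all describe one job — Hail mean-imputing genotypes before exporting them as a BlockMatrix. Each entry's GT is converted with nNonRefAlleles (__gt), a per-row mean over non-missing entries is computed (__mean_gt = Sum(__gt) / Sum(!IsNA(__gt))), missing entries are filled via Coalesce, and the resulting __uid_6 entry field is written with MatrixBlockMatrixWriter (blockSize 4096) to the temporary path shown. The user-level call is not in the log; the sketch below is a reconstruction that lowers to this pattern. The file paths and contig recoding are copied from the MatrixPLINKReader config above; everything else is an assumption.

```python
import hail as hl
from hail.linalg import BlockMatrix

# Numeric PLINK contigs recoded to GRCh38 names, as in the reader config above.
recoding = {str(i): f'chr{i}' for i in range(1, 23)}
recoding.update({'23': 'chrX', '24': 'chrY', '25': 'chrX', '26': 'chrM'})

mt = hl.import_plink(
    bed='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed',
    bim='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim',
    fam='/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam',
    reference_genome='GRCh38',
    contig_recoding=recoding,
)

# Mean-impute the alt-allele count and materialize it as a BlockMatrix.
# This produces the __gt / __mean_gt / Coalesce / MatrixBlockMatrixWriter
# structure visible in the IR dumps (4096 is the default block size).
bm = BlockMatrix.from_entry_expr(mt.GT.n_alt_alleles(),
                                 mean_impute=True,
                                 block_size=4096)
```

hl.pc_relate and hl.hwe_normalized_pca drive the same mean-imputed BlockMatrix export internally, so this log is consistent either with a direct from_entry_expr call or with a PC-Relate run; the PC_Relate/HG38 output directory suggests the latter.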
2023-04-22 21:13:43.559 : INFO: lowering result: TableWrite 2023-04-22 21:13:43.566 MemoryStore: INFO: Block broadcast_177 stored as values in memory (estimated size 47.3 MiB, free 25.0 GiB) 2023-04-22 21:13:44.101 MemoryStore: INFO: Block broadcast_177_piece0 stored as bytes in memory (estimated size 2.3 MiB, free 25.0 GiB) 2023-04-22 21:13:44.101 BlockManagerInfo: INFO: Added broadcast_177_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 2.3 MiB, free: 25.3 GiB) 2023-04-22 21:13:44.102 SparkContext: INFO: Created broadcast 177 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.127 : INFO: initial IR: IR size 5: (ArrayLen (GetField __cols (Let __iruid_2668 (Literal Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]} ) (Ref __iruid_2668)))) 2023-04-22 21:13:44.128 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (I32 4151) 2023-04-22 21:13:44.128 : INFO: after LowerMatrixToTable: IR size 1: (I32 4151) 2023-04-22 21:13:44.128 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (I32 4151) 2023-04-22 21:13:44.128 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (I32 4151) 2023-04-22 21:13:44.128 : INFO: after EvalRelationalLets: IR size 1: (I32 4151) 2023-04-22 21:13:44.128 : INFO: after LowerAndExecuteShuffles: IR size 1: (I32 4151) 2023-04-22 21:13:44.129 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (I32 4151) 2023-04-22 21:13:44.129 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (I32 4151) 2023-04-22 21:13:44.141 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (I32 4151) 2023-04-22 21:13:44.141 : INFO: initial IR: IR size 2: (MakeTuple (0) (I32 4151)) 2023-04-22 21:13:44.141 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.141 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.142 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.142 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.142 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.144 : INFO: encoder cache miss (38 hits, 19 misses, 0.667) 2023-04-22 21:13:44.145 : INFO: instruction count: 3: __C2398HailClassLoaderContainer. 2023-04-22 21:13:44.145 : INFO: instruction count: 3: __C2398HailClassLoaderContainer. 2023-04-22 21:13:44.145 : INFO: instruction count: 3: __C2400FSContainer. 2023-04-22 21:13:44.145 : INFO: instruction count: 3: __C2400FSContainer. 2023-04-22 21:13:44.146 : INFO: instruction count: 3: __C2402etypeEncode. 
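Annotation: the small IRs in the entry above are the writer asking how many columns (samples) the dataset has. Because the __cols literal is already known to the driver, the optimizer folds the size-5 ArrayLen expression straight to the constant (I32 4151), i.e. 4,151 samples, and the compileLowerer rounds then reduce MakeTuple((I32 4151)) to a literal tuple. The encoder-cache figure is simply 38 / (38 + 19) ≈ 0.667. A toy, purely illustrative analogue of the same constant fold (values hypothetical except 4151):

```python
import hail as hl  # Hail initializes a local backend on first use if hl.init() was not called

# The driver already holds the column literal, so its length can be evaluated
# eagerly instead of being compiled into the distributed plan.
cols = hl.literal([f'sample_{i}' for i in range(4151)])  # stand-in for __cols
print(hl.eval(hl.len(cols)))      # 4151 -- analogous to the (I32 4151) fold above

print(round(38 / (38 + 19), 3))   # 0.667, the encoder cache hit rate reported above
```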
2023-04-22 21:13:44.146 : INFO: instruction count: 7: __C2402etypeEncode.apply 2023-04-22 21:13:44.146 : INFO: instruction count: 9: __C2402etypeEncode.__m2404ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int32ENDEND 2023-04-22 21:13:44.146 : INFO: instruction count: 13: __C2402etypeEncode.__m2405ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:13:44.146 : INFO: instruction count: 4: __C2402etypeEncode.__m2406ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.148 MemoryStore: INFO: Block broadcast_178 stored as values in memory (estimated size 104.0 B, free 25.0 GiB) 2023-04-22 21:13:44.162 MemoryStore: INFO: Block broadcast_178_piece0 stored as bytes in memory (estimated size 61.0 B, free 25.0 GiB) 2023-04-22 21:13:44.162 BlockManagerInfo: INFO: Added broadcast_178_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.163 SparkContext: INFO: Created broadcast 178 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.163 : INFO: instruction count: 3: __C2385HailClassLoaderContainer. 2023-04-22 21:13:44.163 : INFO: instruction count: 3: __C2385HailClassLoaderContainer. 2023-04-22 21:13:44.163 : INFO: instruction count: 3: __C2387FSContainer. 2023-04-22 21:13:44.163 : INFO: instruction count: 3: __C2387FSContainer. 2023-04-22 21:13:44.164 : INFO: instruction count: 3: __C2389Compiled. 2023-04-22 21:13:44.164 : INFO: instruction count: 7: __C2389Compiled.apply 2023-04-22 21:13:44.164 : INFO: instruction count: 9: __C2389Compiled.setPartitionIndex 2023-04-22 21:13:44.164 : INFO: instruction count: 4: __C2389Compiled.addPartitionRegion 2023-04-22 21:13:44.164 : INFO: instruction count: 4: __C2389Compiled.setPool 2023-04-22 21:13:44.164 : INFO: instruction count: 3: __C2389Compiled.addHailClassLoader 2023-04-22 21:13:44.164 : INFO: instruction count: 3: __C2389Compiled.addFS 2023-04-22 21:13:44.164 : INFO: instruction count: 4: __C2389Compiled.addTaskContext 2023-04-22 21:13:44.164 : INFO: instruction count: 41: __C2389Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.164 : INFO: instruction count: 27: __C2389Compiled.__m2395DECODE_r_struct_of_r_struct_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.164 : INFO: instruction count: 17: __C2389Compiled.__m2396INPLACE_DECODE_r_struct_of_r_int32END_TO_r_tuple_of_r_int32END 2023-04-22 21:13:44.164 : INFO: instruction count: 10: __C2389Compiled.__m2397INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.165 : INFO: initial IR: IR size 2: (MakeTuple (0) (I32 4151)) 2023-04-22 21:13:44.165 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.165 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.165 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.165 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.165 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.167 : INFO: encoder cache hit 2023-04-22 21:13:44.167 MemoryStore: INFO: Block broadcast_179 stored as values in memory (estimated size 104.0 B, free 25.0 GiB) 2023-04-22 21:13:44.168 MemoryStore: INFO: Block broadcast_179_piece0 stored as bytes in memory (estimated size 61.0 B, free 25.0 GiB) 2023-04-22 21:13:44.170 BlockManagerInfo: INFO: Added broadcast_179_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 61.0 B, free: 25.3 GiB) 
2023-04-22 21:13:44.170 SparkContext: INFO: Created broadcast 179 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.170 : INFO: instruction count: 3: __C2407HailClassLoaderContainer. 2023-04-22 21:13:44.170 : INFO: instruction count: 3: __C2407HailClassLoaderContainer. 2023-04-22 21:13:44.171 : INFO: instruction count: 3: __C2409FSContainer. 2023-04-22 21:13:44.171 : INFO: instruction count: 3: __C2409FSContainer. 2023-04-22 21:13:44.171 : INFO: instruction count: 3: __C2411Compiled. 2023-04-22 21:13:44.171 : INFO: instruction count: 7: __C2411Compiled.apply 2023-04-22 21:13:44.171 : INFO: instruction count: 9: __C2411Compiled.setPartitionIndex 2023-04-22 21:13:44.171 : INFO: instruction count: 4: __C2411Compiled.addPartitionRegion 2023-04-22 21:13:44.171 : INFO: instruction count: 4: __C2411Compiled.setPool 2023-04-22 21:13:44.171 : INFO: instruction count: 3: __C2411Compiled.addHailClassLoader 2023-04-22 21:13:44.171 : INFO: instruction count: 3: __C2411Compiled.addFS 2023-04-22 21:13:44.172 : INFO: instruction count: 4: __C2411Compiled.addTaskContext 2023-04-22 21:13:44.172 : INFO: instruction count: 41: __C2411Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.172 : INFO: instruction count: 27: __C2411Compiled.__m2417DECODE_r_struct_of_r_struct_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.172 : INFO: instruction count: 17: __C2411Compiled.__m2418INPLACE_DECODE_r_struct_of_r_int32END_TO_r_tuple_of_r_int32END 2023-04-22 21:13:44.172 : INFO: instruction count: 10: __C2411Compiled.__m2419INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.172 : INFO: initial IR: IR size 2: (MakeTuple (0) (I32 4151)) 2023-04-22 21:13:44.172 : INFO: after optimize: compileLowerer, initial IR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.172 : INFO: after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.172 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.172 : INFO: after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.173 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 1: (Literal Tuple[Int32] ) 2023-04-22 21:13:44.175 : INFO: encoder cache hit 2023-04-22 21:13:44.175 MemoryStore: INFO: Block broadcast_180 stored as values in memory (estimated size 104.0 B, free 25.0 GiB) 2023-04-22 21:13:44.177 MemoryStore: INFO: Block broadcast_180_piece0 stored as bytes in memory (estimated size 61.0 B, free 25.0 GiB) 2023-04-22 21:13:44.177 BlockManagerInfo: INFO: Added broadcast_180_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.177 SparkContext: INFO: Created broadcast 180 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.177 : INFO: instruction count: 3: __C2420HailClassLoaderContainer. 2023-04-22 21:13:44.177 : INFO: instruction count: 3: __C2420HailClassLoaderContainer. 2023-04-22 21:13:44.177 : INFO: instruction count: 3: __C2422FSContainer. 2023-04-22 21:13:44.177 : INFO: instruction count: 3: __C2422FSContainer. 2023-04-22 21:13:44.178 : INFO: instruction count: 3: __C2424Compiled. 
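Annotation: broadcasts 178-180 (104 B values, 61 B serialized) are the encoded literal tuple (4151,) being shipped so that each compiled function's addAndDecodeLiterals can reconstruct it on the executor side. Conceptually this is ordinary Spark broadcasting of a small serialized blob. A rough PySpark analogue is sketched below; it is not Hail's actual code path (Hail uses its own row encoders, not pickle), and the four-task job is invented for illustration.

```python
import pickle
from pyspark import SparkContext

sc = SparkContext.getOrCreate()

# Driver side: encode the literal once and broadcast the bytes.
encoded = pickle.dumps((4151,))
bc = sc.broadcast(encoded)

# Executor side: each task decodes the literal before running its body,
# mirroring addAndDecodeLiterals in the generated classes above.
counts = sc.parallelize(range(4), 4).map(
    lambda _: pickle.loads(bc.value)[0]
).collect()
print(counts)  # [4151, 4151, 4151, 4151]
```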
2023-04-22 21:13:44.178 : INFO: instruction count: 7: __C2424Compiled.apply 2023-04-22 21:13:44.178 : INFO: instruction count: 9: __C2424Compiled.setPartitionIndex 2023-04-22 21:13:44.178 : INFO: instruction count: 4: __C2424Compiled.addPartitionRegion 2023-04-22 21:13:44.178 : INFO: instruction count: 4: __C2424Compiled.setPool 2023-04-22 21:13:44.178 : INFO: instruction count: 3: __C2424Compiled.addHailClassLoader 2023-04-22 21:13:44.178 : INFO: instruction count: 3: __C2424Compiled.addFS 2023-04-22 21:13:44.178 : INFO: instruction count: 4: __C2424Compiled.addTaskContext 2023-04-22 21:13:44.178 : INFO: instruction count: 41: __C2424Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.178 : INFO: instruction count: 27: __C2424Compiled.__m2430DECODE_r_struct_of_r_struct_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.179 : INFO: instruction count: 17: __C2424Compiled.__m2431INPLACE_DECODE_r_struct_of_r_int32END_TO_r_tuple_of_r_int32END 2023-04-22 21:13:44.179 : INFO: instruction count: 10: __C2424Compiled.__m2432INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.182 : INFO: initial IR: IR size 87: (Let __iruid_2668 (Literal Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]} ) (Let __iruid_2673 (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2669 __iruid_2672 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (MakeStruct (__iruid_2668 (Ref __iruid_2668))) (Let __iruid_2668 (GetField __iruid_2668 (Ref __iruid_2672)) (StreamLen (Let global (Ref __iruid_2668) (StreamMap __iruid_2670 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2669)) (Let row (Ref __iruid_2670) (Let __iruid_2659 (ToArray (StreamMap __iruid_2660 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2661 (InsertFields (SelectFields () (Ref __iruid_2660)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2660))))) (If (IsNA (Ref __iruid_2661)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2661))))) (Let __iruid_2662 (StreamAgg __iruid_2663 (StreamFilter __iruid_2664 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2659)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2664))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2665 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2663)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2665))) (Let __iruid_2666 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2665)))))) (Cast Float64 (Ref __iruid_2666)))))))) (InsertFields (Ref __iruid_2662) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2667 (ToStream False (Ref __iruid_2659)) (InsertFields (SelectFields () (Ref __iruid_2667)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2667))) (GetField __mean_gt (Ref __iruid_2662)))))))))))))))) (NA String)) (Ref __iruid_2673))) 2023-04-22 21:13:44.198 : INFO: Prune: MakeStruct: eliminating field '__iruid_2668' 2023-04-22 21:13:44.201 : INFO: after optimize: relationalLowerer, initial IR: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2682 __iruid_2683 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2682))) (NA String)) 2023-04-22 21:13:44.202 : INFO: after LowerMatrixToTable: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2682 __iruid_2683 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2682))) (NA String)) 2023-04-22 21:13:44.209 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2688 __iruid_2689 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2688))) (NA String)) 2023-04-22 21:13:44.209 : INFO: after LiftRelationalValuesToRelationalLets: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2688 __iruid_2689 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2688))) (NA String)) 2023-04-22 21:13:44.210 : INFO: after EvalRelationalLets: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2688 __iruid_2689 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2688))) (NA String)) 2023-04-22 21:13:44.210 : INFO: after LowerAndExecuteShuffles: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2688 __iruid_2689 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2688))) (NA String)) 2023-04-22 21:13:44.225 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2694 __iruid_2695 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2694))) (NA String)) 2023-04-22 21:13:44.226 : INFO: after LowerOrInterpretNonCompilable: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2694 __iruid_2695 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2694))) (NA String)) 2023-04-22 21:13:44.228 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 8: (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2700 __iruid_2701 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2700))) (NA String)) 2023-04-22 21:13:44.228 : INFO: initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2700 __iruid_2701 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2700))) (NA String))) 2023-04-22 21:13:44.231 : INFO: after optimize: compileLowerer, initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2706 __iruid_2707 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2706))) (NA String))) 2023-04-22 21:13:44.231 : INFO: after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2706 __iruid_2707 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2706))) (NA String))) 2023-04-22 21:13:44.248 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2712 __iruid_2713 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2712))) (NA String))) 2023-04-22 21:13:44.249 : INFO: after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2712 __iruid_2713 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2712))) (NA String))) 2023-04-22 21:13:44.251 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2718 __iruid_2719 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2718))) (NA String))) 2023-04-22 21:13:44.257 : INFO: instruction count: 3: __C2444HailClassLoaderContainer. 2023-04-22 21:13:44.257 : INFO: instruction count: 3: __C2444HailClassLoaderContainer. 2023-04-22 21:13:44.257 : INFO: instruction count: 3: __C2446FSContainer. 2023-04-22 21:13:44.257 : INFO: instruction count: 3: __C2446FSContainer. 
2023-04-22 21:13:44.274 : INFO: instruction count: 3: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts. 2023-04-22 21:13:44.274 : INFO: instruction count: 111: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.274 : INFO: instruction count: 17: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.274 : INFO: instruction count: 27: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2450DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.274 : INFO: instruction count: 44: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2451INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.274 : INFO: instruction count: 31: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2452INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.274 : INFO: instruction count: 10: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2453INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.274 : INFO: instruction count: 27: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2455DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.274 : INFO: instruction count: 8: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2456INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.274 : INFO: instruction count: 102: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2458split_StreamLen 2023-04-22 21:13:44.274 : INFO: instruction count: 13: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2469ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:13:44.275 : INFO: instruction count: 4: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2470ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.275 : INFO: instruction count: 9: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPartitionIndex 2023-04-22 21:13:44.275 : INFO: instruction count: 4: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.addPartitionRegion 2023-04-22 21:13:44.275 : INFO: instruction count: 4: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPool 2023-04-22 21:13:44.275 : INFO: instruction count: 3: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.addHailClassLoader 2023-04-22 21:13:44.275 : INFO: instruction count: 3: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.addFS 2023-04-22 21:13:44.275 : INFO: instruction count: 4: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.addTaskContext 2023-04-22 21:13:44.275 : INFO: instruction count: 3: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts.setObjects 2023-04-22 21:13:44.275 : INFO: instruction count: 3: __C2466staticWrapperClass_1. 
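Annotation: __C2448collect_distributed_array_matrix_block_matrix_writer_partition_counts is the generated task body for the CollectDistributedArray node above. Each Spark task decodes one {bed, start, end, partitionIndex} context, streams the corresponding slice of the PLINK file, and returns StreamLen (the __m2458split_StreamLen method), i.e. that partition's row count, which the block-matrix writer needs for its layout. A schematic PySpark equivalent is below; the contexts, paths, and count_rows helper are hypothetical, and the real job runs the compiled JVM bytecode listed above rather than Python.

```python
from pyspark import SparkContext

sc = SparkContext.getOrCreate()

# One context per partition, mirroring Literal Array[Struct{bed,start,end,partitionIndex}].
contexts = [
    {'bed': '/path/to/data.bed', 'start': 0,    'end': 1000, 'partitionIndex': 0},
    {'bed': '/path/to/data.bed', 'start': 1000, 'end': 2000, 'partitionIndex': 1},
]  # hypothetical values; the real contexts come from the .bim/.bed partitioning

def count_rows(ctx):
    # Stand-in for StreamLen over ReadPartition: the real task decodes the
    # PLINK records in [start, end) and counts them.
    return ctx['end'] - ctx['start']

partition_counts = sc.parallelize(contexts, len(contexts)).map(count_rows).collect()
print(partition_counts)  # e.g. [1000, 1000]
```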
2023-04-22 21:13:44.281 : INFO: encoder cache hit 2023-04-22 21:13:44.282 MemoryStore: INFO: Block broadcast_181 stored as values in memory (estimated size 688.0 B, free 25.0 GiB) 2023-04-22 21:13:44.283 MemoryStore: INFO: Block broadcast_181_piece0 stored as bytes in memory (estimated size 217.0 B, free 25.0 GiB) 2023-04-22 21:13:44.283 BlockManagerInfo: INFO: Added broadcast_181_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.283 SparkContext: INFO: Created broadcast 181 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.283 : INFO: instruction count: 3: __C2433HailClassLoaderContainer. 2023-04-22 21:13:44.283 : INFO: instruction count: 3: __C2433HailClassLoaderContainer. 2023-04-22 21:13:44.283 : INFO: instruction count: 3: __C2435FSContainer. 2023-04-22 21:13:44.283 : INFO: instruction count: 3: __C2435FSContainer. 2023-04-22 21:13:44.286 : INFO: instruction count: 3: __C2437Compiled. 2023-04-22 21:13:44.287 : INFO: instruction count: 294: __C2437Compiled.apply 2023-04-22 21:13:44.287 : INFO: instruction count: 4: __C2437Compiled.setBackend 2023-04-22 21:13:44.287 : INFO: instruction count: 9: __C2437Compiled.__m2480ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:13:44.287 : INFO: instruction count: 49: __C2437Compiled.__m2481ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.287 : INFO: instruction count: 16: __C2437Compiled.__m2482ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:44.287 : INFO: instruction count: 4: __C2437Compiled.__m2483ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.287 : INFO: instruction count: 9: __C2437Compiled.__m2484ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:13:44.287 : INFO: instruction count: 1: __C2437Compiled.__m2485ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:13:44.287 : INFO: instruction count: 27: __C2437Compiled.__m2488DECODE_r_struct_of_r_int32END_TO_SBaseStructPointer 2023-04-22 21:13:44.287 : INFO: instruction count: 10: __C2437Compiled.__m2489INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.287 : INFO: instruction count: 9: __C2437Compiled.setPartitionIndex 2023-04-22 21:13:44.287 : INFO: instruction count: 4: __C2437Compiled.addPartitionRegion 2023-04-22 21:13:44.287 : INFO: instruction count: 4: __C2437Compiled.setPool 2023-04-22 21:13:44.287 : INFO: instruction count: 3: __C2437Compiled.addHailClassLoader 2023-04-22 21:13:44.287 : INFO: instruction count: 3: __C2437Compiled.addFS 2023-04-22 21:13:44.287 : INFO: instruction count: 4: __C2437Compiled.addTaskContext 2023-04-22 21:13:44.287 : INFO: instruction count: 3: __C2437Compiled.setObjects 2023-04-22 21:13:44.287 : INFO: instruction count: 64: __C2437Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.287 : INFO: instruction count: 36: __C2437Compiled.__m2494DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.287 : INFO: instruction count: 8: __C2437Compiled.__m2495INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.287 : INFO: instruction count: 58: __C2437Compiled.__m2496INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.287 : INFO: instruction count: 44: 
__C2437Compiled.__m2497INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.287 : INFO: instruction count: 31: __C2437Compiled.__m2498INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.288 : INFO: instruction count: 3: __C2486staticWrapperClass_1. 2023-04-22 21:13:44.288 : INFO: initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2700 __iruid_2701 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2700))) (NA String))) 2023-04-22 21:13:44.291 : INFO: after optimize: compileLowerer, initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2725 __iruid_2726 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2725))) (NA String))) 2023-04-22 21:13:44.291 : INFO: after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2725 __iruid_2726 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2725))) (NA String))) 2023-04-22 21:13:44.295 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2731 __iruid_2732 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2731))) (NA String))) 2023-04-22 21:13:44.296 : INFO: after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2731 __iruid_2732 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2731))) (NA String))) 2023-04-22 21:13:44.307 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2737 __iruid_2738 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2737))) (NA String))) 2023-04-22 21:13:44.328 : INFO: instruction count: 3: __C2510HailClassLoaderContainer. 2023-04-22 21:13:44.328 : INFO: instruction count: 3: __C2510HailClassLoaderContainer. 2023-04-22 21:13:44.329 : INFO: instruction count: 3: __C2512FSContainer. 2023-04-22 21:13:44.330 : INFO: instruction count: 3: __C2512FSContainer. 2023-04-22 21:13:44.331 : INFO: instruction count: 3: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts. 
2023-04-22 21:13:44.332 : INFO: instruction count: 111: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.332 : INFO: instruction count: 17: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.332 : INFO: instruction count: 27: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2516DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.332 : INFO: instruction count: 44: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2517INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.332 : INFO: instruction count: 31: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2518INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.332 : INFO: instruction count: 10: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2519INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.332 : INFO: instruction count: 27: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2521DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.332 : INFO: instruction count: 8: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2522INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.332 : INFO: instruction count: 102: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2524split_StreamLen 2023-04-22 21:13:44.332 : INFO: instruction count: 13: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2535ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:13:44.332 : INFO: instruction count: 4: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2536ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.332 : INFO: instruction count: 9: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPartitionIndex 2023-04-22 21:13:44.332 : INFO: instruction count: 4: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.addPartitionRegion 2023-04-22 21:13:44.332 : INFO: instruction count: 4: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPool 2023-04-22 21:13:44.332 : INFO: instruction count: 3: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.addHailClassLoader 2023-04-22 21:13:44.332 : INFO: instruction count: 3: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.addFS 2023-04-22 21:13:44.332 : INFO: instruction count: 4: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.addTaskContext 2023-04-22 21:13:44.332 : INFO: instruction count: 3: __C2514collect_distributed_array_matrix_block_matrix_writer_partition_counts.setObjects 2023-04-22 21:13:44.332 : INFO: instruction count: 3: __C2532staticWrapperClass_1. 
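The entries above trace Hail's compileLowerer passes (initial IR, after optimize, after InlineApplyIR, after LowerArrayAggsToRunAggs) and the JVM classes generated for a CollectDistributedArray named matrix_block_matrix_writer_partition_counts: a per-partition row count that runs before entries are written out as a block matrix. The IR pattern visible throughout this section (GT mapped through nNonRefAlleles to __gt, a per-row __mean_gt, and a Coalesce of the two) is mean imputation of missing genotypes, which is the pattern BlockMatrix.from_entry_expr(..., mean_impute=True) produces, for example inside hl.pc_relate. A minimal, hedged sketch of the kind of driver code that produces such a log section; the file paths, MAF threshold and k below are illustrative placeholders, not values taken from this log:

    import hail as hl

    hl.init()  # local Spark backend, as in this log
    # the bed/start/end/partitionIndex contexts in the IR point at a PLINK import
    mt = hl.import_plink(bed='data.bed', bim='data.bim', fam='data.fam',
                         reference_genome='GRCh38')
    # pc_relate mean-imputes n_alt_alleles() per variant and writes the entries
    # as a BlockMatrix (default block_size 4096), which drives the
    # matrix_block_matrix_writer / partition_counts stages logged here
    rel = hl.pc_relate(mt.GT, min_individual_maf=0.01, k=10)

The same mean-imputation pattern can also be produced directly with hl.linalg.BlockMatrix.from_entry_expr(mt.GT.n_alt_alleles(), mean_impute=True).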
2023-04-22 21:13:44.337 : INFO: encoder cache hit 2023-04-22 21:13:44.338 MemoryStore: INFO: Block broadcast_182 stored as values in memory (estimated size 688.0 B, free 25.0 GiB) 2023-04-22 21:13:44.354 MemoryStore: INFO: Block broadcast_182_piece0 stored as bytes in memory (estimated size 217.0 B, free 25.0 GiB) 2023-04-22 21:13:44.354 BlockManagerInfo: INFO: Added broadcast_182_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.369 SparkContext: INFO: Created broadcast 182 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.369 : INFO: instruction count: 3: __C2499HailClassLoaderContainer. 2023-04-22 21:13:44.369 : INFO: instruction count: 3: __C2499HailClassLoaderContainer. 2023-04-22 21:13:44.369 : INFO: instruction count: 3: __C2501FSContainer. 2023-04-22 21:13:44.369 : INFO: instruction count: 3: __C2501FSContainer. 2023-04-22 21:13:44.372 : INFO: instruction count: 3: __C2503Compiled. 2023-04-22 21:13:44.373 : INFO: instruction count: 294: __C2503Compiled.apply 2023-04-22 21:13:44.373 : INFO: instruction count: 4: __C2503Compiled.setBackend 2023-04-22 21:13:44.373 : INFO: instruction count: 9: __C2503Compiled.__m2546ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:13:44.373 : INFO: instruction count: 49: __C2503Compiled.__m2547ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.373 : INFO: instruction count: 16: __C2503Compiled.__m2548ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:44.373 : INFO: instruction count: 4: __C2503Compiled.__m2549ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.373 : INFO: instruction count: 9: __C2503Compiled.__m2550ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:13:44.373 : INFO: instruction count: 1: __C2503Compiled.__m2551ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:13:44.373 : INFO: instruction count: 27: __C2503Compiled.__m2554DECODE_r_struct_of_r_int32END_TO_SBaseStructPointer 2023-04-22 21:13:44.373 : INFO: instruction count: 10: __C2503Compiled.__m2555INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.373 : INFO: instruction count: 9: __C2503Compiled.setPartitionIndex 2023-04-22 21:13:44.373 : INFO: instruction count: 4: __C2503Compiled.addPartitionRegion 2023-04-22 21:13:44.373 : INFO: instruction count: 4: __C2503Compiled.setPool 2023-04-22 21:13:44.373 : INFO: instruction count: 3: __C2503Compiled.addHailClassLoader 2023-04-22 21:13:44.373 : INFO: instruction count: 3: __C2503Compiled.addFS 2023-04-22 21:13:44.373 : INFO: instruction count: 4: __C2503Compiled.addTaskContext 2023-04-22 21:13:44.373 : INFO: instruction count: 3: __C2503Compiled.setObjects 2023-04-22 21:13:44.373 : INFO: instruction count: 64: __C2503Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.373 : INFO: instruction count: 36: __C2503Compiled.__m2560DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.373 : INFO: instruction count: 8: __C2503Compiled.__m2561INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.373 : INFO: instruction count: 58: __C2503Compiled.__m2562INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.373 : INFO: instruction count: 44: 
__C2503Compiled.__m2563INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.373 : INFO: instruction count: 31: __C2503Compiled.__m2564INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.373 : INFO: instruction count: 3: __C2552staticWrapperClass_1. 2023-04-22 21:13:44.374 : INFO: initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2700 __iruid_2701 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2700))) (NA String))) 2023-04-22 21:13:44.376 : INFO: after optimize: compileLowerer, initial IR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2744 __iruid_2745 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2744))) (NA String))) 2023-04-22 21:13:44.377 : INFO: after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2744 __iruid_2745 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2744))) (NA String))) 2023-04-22 21:13:44.380 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2750 __iruid_2751 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2750))) (NA String))) 2023-04-22 21:13:44.381 : INFO: after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2750 __iruid_2751 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2750))) (NA String))) 2023-04-22 21:13:44.383 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 9: (MakeTuple (0) (CollectDistributedArray matrix_block_matrix_writer_partition_counts __iruid_2756 __iruid_2757 (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (Literal Struct{} ) (StreamLen (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2756))) (NA String))) 2023-04-22 21:13:44.388 : INFO: instruction count: 3: __C2576HailClassLoaderContainer. 2023-04-22 21:13:44.388 : INFO: instruction count: 3: __C2576HailClassLoaderContainer. 2023-04-22 21:13:44.388 : INFO: instruction count: 3: __C2578FSContainer. 2023-04-22 21:13:44.388 : INFO: instruction count: 3: __C2578FSContainer. 2023-04-22 21:13:44.405 : INFO: instruction count: 3: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts. 
2023-04-22 21:13:44.405 : INFO: instruction count: 111: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.405 : INFO: instruction count: 17: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.apply 2023-04-22 21:13:44.406 : INFO: instruction count: 27: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2582DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.406 : INFO: instruction count: 44: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2583INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.406 : INFO: instruction count: 31: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2584INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.406 : INFO: instruction count: 10: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2585INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.406 : INFO: instruction count: 27: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2587DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.406 : INFO: instruction count: 8: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2588INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.406 : INFO: instruction count: 102: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2590split_StreamLen 2023-04-22 21:13:44.406 : INFO: instruction count: 13: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2601ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:13:44.406 : INFO: instruction count: 4: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.__m2602ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.406 : INFO: instruction count: 9: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPartitionIndex 2023-04-22 21:13:44.406 : INFO: instruction count: 4: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.addPartitionRegion 2023-04-22 21:13:44.406 : INFO: instruction count: 4: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.setPool 2023-04-22 21:13:44.406 : INFO: instruction count: 3: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.addHailClassLoader 2023-04-22 21:13:44.406 : INFO: instruction count: 3: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.addFS 2023-04-22 21:13:44.406 : INFO: instruction count: 4: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.addTaskContext 2023-04-22 21:13:44.406 : INFO: instruction count: 3: __C2580collect_distributed_array_matrix_block_matrix_writer_partition_counts.setObjects 2023-04-22 21:13:44.406 : INFO: instruction count: 3: __C2598staticWrapperClass_1. 
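Below, the compiled D-Array is submitted as Spark job 45 / stage 84 with 8 tasks, one per partition context; each task logs a TaskReport with its peak region-pool memory (128 KiB here) and finishes in roughly 4.5 s on the single local executor. To relate these stage sizes back to the dataset, the partitioning of the source MatrixTable can be inspected or changed before writing; a small sketch using standard Hail methods (the variable mt is carried over from the sketch above, and the target of 64 partitions is only an example):

    # one D-Array task per row partition in stages like 84
    print(mt.n_partitions())
    # repartition for more parallelism in the block-matrix write, if desired
    mt = mt.repartition(64, shuffle=True)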
2023-04-22 21:13:44.411 : INFO: encoder cache hit 2023-04-22 21:13:44.411 MemoryStore: INFO: Block broadcast_183 stored as values in memory (estimated size 688.0 B, free 25.0 GiB) 2023-04-22 21:13:44.413 MemoryStore: INFO: Block broadcast_183_piece0 stored as bytes in memory (estimated size 217.0 B, free 25.0 GiB) 2023-04-22 21:13:44.414 BlockManagerInfo: INFO: Added broadcast_183_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 217.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.427 SparkContext: INFO: Created broadcast 183 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.427 : INFO: instruction count: 3: __C2565HailClassLoaderContainer. 2023-04-22 21:13:44.427 : INFO: instruction count: 3: __C2565HailClassLoaderContainer. 2023-04-22 21:13:44.427 : INFO: instruction count: 3: __C2567FSContainer. 2023-04-22 21:13:44.427 : INFO: instruction count: 3: __C2567FSContainer. 2023-04-22 21:13:44.430 : INFO: instruction count: 3: __C2569Compiled. 2023-04-22 21:13:44.430 : INFO: instruction count: 294: __C2569Compiled.apply 2023-04-22 21:13:44.430 : INFO: instruction count: 4: __C2569Compiled.setBackend 2023-04-22 21:13:44.431 : INFO: instruction count: 9: __C2569Compiled.__m2612ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:13:44.431 : INFO: instruction count: 49: __C2569Compiled.__m2613ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.431 : INFO: instruction count: 16: __C2569Compiled.__m2614ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:13:44.431 : INFO: instruction count: 4: __C2569Compiled.__m2615ENCODE_SInt32$_TO_r_int32 2023-04-22 21:13:44.431 : INFO: instruction count: 9: __C2569Compiled.__m2616ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:13:44.431 : INFO: instruction count: 1: __C2569Compiled.__m2617ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:13:44.431 : INFO: instruction count: 27: __C2569Compiled.__m2620DECODE_r_struct_of_r_int32END_TO_SBaseStructPointer 2023-04-22 21:13:44.431 : INFO: instruction count: 10: __C2569Compiled.__m2621INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:13:44.431 : INFO: instruction count: 9: __C2569Compiled.setPartitionIndex 2023-04-22 21:13:44.431 : INFO: instruction count: 4: __C2569Compiled.addPartitionRegion 2023-04-22 21:13:44.431 : INFO: instruction count: 4: __C2569Compiled.setPool 2023-04-22 21:13:44.431 : INFO: instruction count: 3: __C2569Compiled.addHailClassLoader 2023-04-22 21:13:44.431 : INFO: instruction count: 3: __C2569Compiled.addFS 2023-04-22 21:13:44.431 : INFO: instruction count: 4: __C2569Compiled.addTaskContext 2023-04-22 21:13:44.431 : INFO: instruction count: 3: __C2569Compiled.setObjects 2023-04-22 21:13:44.431 : INFO: instruction count: 64: __C2569Compiled.addAndDecodeLiterals 2023-04-22 21:13:44.431 : INFO: instruction count: 36: __C2569Compiled.__m2626DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:13:44.431 : INFO: instruction count: 8: __C2569Compiled.__m2627INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:13:44.431 : INFO: instruction count: 58: __C2569Compiled.__m2628INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.431 : INFO: instruction count: 44: 
__C2569Compiled.__m2629INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:13:44.431 : INFO: instruction count: 31: __C2569Compiled.__m2630INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:13:44.431 : INFO: instruction count: 3: __C2618staticWrapperClass_1. 2023-04-22 21:13:44.434 : INFO: executing D-Array [matrix_block_matrix_writer_partition_counts] with 8 tasks 2023-04-22 21:13:44.434 MemoryStore: INFO: Block broadcast_184 stored as values in memory (estimated size 64.0 B, free 25.0 GiB) 2023-04-22 21:13:44.435 MemoryStore: INFO: Block broadcast_184_piece0 stored as bytes in memory (estimated size 49.0 B, free 25.0 GiB) 2023-04-22 21:13:44.435 BlockManagerInfo: INFO: Added broadcast_184_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 49.0 B, free: 25.3 GiB) 2023-04-22 21:13:44.436 SparkContext: INFO: Created broadcast 184 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.437 MemoryStore: INFO: Block broadcast_185 stored as values in memory (estimated size 429.5 KiB, free 25.0 GiB) 2023-04-22 21:13:44.461 MemoryStore: INFO: Block broadcast_185_piece0 stored as bytes in memory (estimated size 32.4 KiB, free 25.0 GiB) 2023-04-22 21:13:44.461 BlockManagerInfo: INFO: Added broadcast_185_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 32.4 KiB, free: 25.3 GiB) 2023-04-22 21:13:44.461 SparkContext: INFO: Created broadcast 185 from broadcast at SparkBackend.scala:354 2023-04-22 21:13:44.586 SparkContext: INFO: Starting job: collect at SparkBackend.scala:368 2023-04-22 21:13:44.586 DAGScheduler: INFO: Got job 45 (collect at SparkBackend.scala:368) with 8 output partitions 2023-04-22 21:13:44.586 DAGScheduler: INFO: Final stage: ResultStage 84 (collect at SparkBackend.scala:368) 2023-04-22 21:13:44.586 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:13:44.586 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:13:44.587 DAGScheduler: INFO: Submitting ResultStage 84 (SparkBackendComputeRDD[186] at RDD at SparkBackend.scala:784), which has no missing parents 2023-04-22 21:13:44.622 MemoryStore: INFO: Block broadcast_186 stored as values in memory (estimated size 746.2 KiB, free 25.0 GiB) 2023-04-22 21:13:44.627 MemoryStore: INFO: Block broadcast_186_piece0 stored as bytes in memory (estimated size 375.3 KiB, free 25.0 GiB) 2023-04-22 21:13:44.628 BlockManagerInfo: INFO: Added broadcast_186_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 375.3 KiB, free: 25.3 GiB) 2023-04-22 21:13:44.628 SparkContext: INFO: Created broadcast 186 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:13:44.628 DAGScheduler: INFO: Submitting 8 missing tasks from ResultStage 84 (SparkBackendComputeRDD[186] at RDD at SparkBackend.scala:784) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7)) 2023-04-22 21:13:44.628 TaskSchedulerImpl: INFO: Adding task set 84.0 with 8 tasks resource profile 0 2023-04-22 21:13:44.629 TaskSetManager: INFO: Starting task 0.0 in stage 84.0 (TID 440) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:13:44.629 Executor: INFO: Running task 0.0 in stage 84.0 (TID 440) 2023-04-22 21:13:44.662 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 0.0 in stage 84.0 (TID 440) 2023-04-22 21:13:49.197 : INFO: TaskReport: stage=84, partition=0, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks 
requested=0, cache hits=0 2023-04-22 21:13:49.197 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 84.0 (TID 440) 2023-04-22 21:13:49.197 Executor: INFO: Finished task 0.0 in stage 84.0 (TID 440). 822 bytes result sent to driver 2023-04-22 21:13:49.198 TaskSetManager: INFO: Starting task 1.0 in stage 84.0 (TID 441) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:13:49.198 TaskSetManager: INFO: Finished task 0.0 in stage 84.0 (TID 440) in 4569 ms on uger-c010.broadinstitute.org (executor driver) (1/8) 2023-04-22 21:13:49.198 Executor: INFO: Running task 1.0 in stage 84.0 (TID 441) 2023-04-22 21:13:49.220 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 1.0 in stage 84.0 (TID 441) 2023-04-22 21:13:53.643 : INFO: TaskReport: stage=84, partition=1, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:53.643 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 1.0 in stage 84.0 (TID 441) 2023-04-22 21:13:53.643 Executor: INFO: Finished task 1.0 in stage 84.0 (TID 441). 822 bytes result sent to driver 2023-04-22 21:13:53.644 TaskSetManager: INFO: Starting task 2.0 in stage 84.0 (TID 442) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:13:53.644 TaskSetManager: INFO: Finished task 1.0 in stage 84.0 (TID 441) in 4447 ms on uger-c010.broadinstitute.org (executor driver) (2/8) 2023-04-22 21:13:53.644 Executor: INFO: Running task 2.0 in stage 84.0 (TID 442) 2023-04-22 21:13:53.666 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 2.0 in stage 84.0 (TID 442) 2023-04-22 21:13:58.090 : INFO: TaskReport: stage=84, partition=2, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:13:58.090 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 2.0 in stage 84.0 (TID 442) 2023-04-22 21:13:58.090 Executor: INFO: Finished task 2.0 in stage 84.0 (TID 442). 822 bytes result sent to driver 2023-04-22 21:13:58.091 TaskSetManager: INFO: Starting task 3.0 in stage 84.0 (TID 443) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:13:58.091 TaskSetManager: INFO: Finished task 2.0 in stage 84.0 (TID 442) in 4447 ms on uger-c010.broadinstitute.org (executor driver) (3/8) 2023-04-22 21:13:58.091 Executor: INFO: Running task 3.0 in stage 84.0 (TID 443) 2023-04-22 21:13:58.112 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 3.0 in stage 84.0 (TID 443) 2023-04-22 21:14:02.640 : INFO: TaskReport: stage=84, partition=3, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:14:02.640 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 3.0 in stage 84.0 (TID 443) 2023-04-22 21:14:02.640 Executor: INFO: Finished task 3.0 in stage 84.0 (TID 443). 
822 bytes result sent to driver 2023-04-22 21:14:02.642 TaskSetManager: INFO: Starting task 4.0 in stage 84.0 (TID 444) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:14:02.642 TaskSetManager: INFO: Finished task 3.0 in stage 84.0 (TID 443) in 4552 ms on uger-c010.broadinstitute.org (executor driver) (4/8) 2023-04-22 21:14:02.645 Executor: INFO: Running task 4.0 in stage 84.0 (TID 444) 2023-04-22 21:14:02.666 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 4.0 in stage 84.0 (TID 444) 2023-04-22 21:14:07.088 : INFO: TaskReport: stage=84, partition=4, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:14:07.088 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 4.0 in stage 84.0 (TID 444) 2023-04-22 21:14:07.088 Executor: INFO: Finished task 4.0 in stage 84.0 (TID 444). 822 bytes result sent to driver 2023-04-22 21:14:07.088 TaskSetManager: INFO: Starting task 5.0 in stage 84.0 (TID 445) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:14:07.089 TaskSetManager: INFO: Finished task 4.0 in stage 84.0 (TID 444) in 4447 ms on uger-c010.broadinstitute.org (executor driver) (5/8) 2023-04-22 21:14:07.089 Executor: INFO: Running task 5.0 in stage 84.0 (TID 445) 2023-04-22 21:14:07.110 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 5.0 in stage 84.0 (TID 445) 2023-04-22 21:14:11.524 : INFO: TaskReport: stage=84, partition=5, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:14:11.524 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 84.0 (TID 445) 2023-04-22 21:14:11.525 Executor: INFO: Finished task 5.0 in stage 84.0 (TID 445). 822 bytes result sent to driver 2023-04-22 21:14:11.525 TaskSetManager: INFO: Starting task 6.0 in stage 84.0 (TID 446) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:14:11.525 TaskSetManager: INFO: Finished task 5.0 in stage 84.0 (TID 445) in 4437 ms on uger-c010.broadinstitute.org (executor driver) (6/8) 2023-04-22 21:14:11.526 Executor: INFO: Running task 6.0 in stage 84.0 (TID 446) 2023-04-22 21:14:11.547 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 6.0 in stage 84.0 (TID 446) 2023-04-22 21:14:15.965 : INFO: TaskReport: stage=84, partition=6, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:14:15.965 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 84.0 (TID 446) 2023-04-22 21:14:15.966 Executor: INFO: Finished task 6.0 in stage 84.0 (TID 446). 
822 bytes result sent to driver 2023-04-22 21:14:15.967 TaskSetManager: INFO: Starting task 7.0 in stage 84.0 (TID 447) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4372 bytes) taskResourceAssignments Map() 2023-04-22 21:14:15.967 TaskSetManager: INFO: Finished task 6.0 in stage 84.0 (TID 446) in 4442 ms on uger-c010.broadinstitute.org (executor driver) (7/8) 2023-04-22 21:14:15.981 Executor: INFO: Running task 7.0 in stage 84.0 (TID 447) 2023-04-22 21:14:16.011 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 7.0 in stage 84.0 (TID 447) 2023-04-22 21:14:20.482 : INFO: TaskReport: stage=84, partition=7, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:14:20.482 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 84.0 (TID 447) 2023-04-22 21:14:20.483 Executor: INFO: Finished task 7.0 in stage 84.0 (TID 447). 822 bytes result sent to driver 2023-04-22 21:14:20.483 TaskSetManager: INFO: Finished task 7.0 in stage 84.0 (TID 447) in 4516 ms on uger-c010.broadinstitute.org (executor driver) (8/8) 2023-04-22 21:14:20.483 TaskSchedulerImpl: INFO: Removed TaskSet 84.0, whose tasks have all completed, from pool 2023-04-22 21:14:20.483 DAGScheduler: INFO: ResultStage 84 (collect at SparkBackend.scala:368) finished in 35.896 s 2023-04-22 21:14:20.483 DAGScheduler: INFO: Job 45 is finished. Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:14:20.483 TaskSchedulerImpl: INFO: Killing all running tasks in stage 84: Stage finished 2023-04-22 21:14:20.486 DAGScheduler: INFO: Job 45 finished: collect at SparkBackend.scala:368, took 35.900211 s 2023-04-22 21:14:20.486 : INFO: executed D-Array [matrix_block_matrix_writer_partition_counts] in 36.052s 2023-04-22 21:14:20.569 : INFO: compiling and evaluating result: TableWrite 2023-04-22 21:14:20.899 : INFO: initial IR: IR size 272: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_2794 (ToStream False (ArraySort __iruid_2792 __iruid_2793 (StreamFlatMap __iruid_2791 (ToStream False (Let __iruid_2668 (Literal Struct{__cols:Array[Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean}]} ) (Let __iruid_2790 (CollectDistributedArray matrix_block_matrix_writer __iruid_2779 __iruid_2786 (StreamFlatMap __iruid_2778 (StreamZip -1 AssertSameLength (__iruid_2775 __iruid_2776) (Let __iruid_2767 (ToArray (StreamZip -1 AssertSameLength (__iruid_2759 __iruid_2760 __iruid_2761) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_2759)) (mwStartIdx (Cast Int32 (Ref __iruid_2760))) (mwStopIdx (Cast Int32 (Ref __iruid_2761)))))) (StreamMap __iruid_2768 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField 
partitionBound (Ref __iruid_2768))) (oldContexts (ToArray (StreamMap __iruid_2769 (ToStream False (GetField parentPartitions (Ref __iruid_2768))) (ArrayRef -1 (Ref __iruid_2767) (Ref __iruid_2769)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_2777 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_2775)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_2777) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_2777) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_2777)) (blockRowIdx (Ref __iruid_2776))))) (Ref __iruid_2778)) (MakeStruct (__iruid_2668 (Ref __iruid_2668))) (Let __iruid_2668 (GetField __iruid_2668 (Ref __iruid_2786)) (ToArray (StreamMap __iruid_2787 (Let __iruid_2781 (GetField blockStart (Ref __iruid_2779)) (Let __iruid_2784 (ToArray (StreamMap __iruid_2782 (Let __iruid_2780 (GetField oldTableCtx (Ref __iruid_2779)) (Let __iruid_2771 (Ref __iruid_2780) (Let __iruid_2772 (GetField partitionBound (Ref __iruid_2771)) (StreamTakeWhile __iruid_2774 (StreamDropWhile __iruid_2773 (StreamFlatMap __iruid_2770 (ToStream True (GetField oldContexts (Ref __iruid_2771))) (Let __iruid_2762 (Ref __iruid_2770) (StreamZip -1 AssertSameLength (__iruid_2765 __iruid_2766) (Let __iruid_2763 (GetField mwOld (Ref __iruid_2762)) (Let __iruid_2669 (Ref __iruid_2763) (Let global (Ref __iruid_2668) (StreamMap __iruid_2670 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (Ref __iruid_2669)) (Let row (Ref __iruid_2670) (Let __iruid_2659 (ToArray (StreamMap __iruid_2660 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref row))) (Let __iruid_2661 (InsertFields (SelectFields () (Ref __iruid_2660)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2660))))) (If (IsNA (Ref __iruid_2661)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2661))))) (Let __iruid_2662 (StreamAgg __iruid_2663 (StreamFilter __iruid_2664 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2659)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2664))))) (InsertFields (SelectFields (locus alleles rsid cm_position) (Ref row)) None (__mean_gt (AggLet __iruid_2665 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2659) (Ref __iruid_2663)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2665))) (Let __iruid_2666 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2665)))))) (Cast Float64 (Ref __iruid_2666)))))))) (InsertFields (Ref __iruid_2662) ("locus" "alleles" "rsid" "cm_position" "__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2667 (ToStream False (Ref __iruid_2659)) (InsertFields (SelectFields () (Ref __iruid_2667)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2667))) (GetField __mean_gt (Ref __iruid_2662)))))))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_2762)) (GetField mwStopIdx (Ref __iruid_2762)) (I32 1)) (InsertFields (Ref __iruid_2765) None (__iruid_2764 (Ref __iruid_2766)))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2773)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2772)) (Apply -1 includesStart () Boolean (Ref __iruid_2772)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2774)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2772)) (Apply -1 includesEnd () Boolean (Ref __iruid_2772))))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_2782))) (rowOfData (ToArray (StreamMap __iruid_2783 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_2782)) (Ref __iruid_2781) (ApplyBinaryPrimOp Add (Ref __iruid_2781) (GetField blockSize (Ref __iruid_2779))) (I32 1))) (GetField __uid_6 (Ref __iruid_2783)))))))) (MakeStream Stream[Struct{__iruid_2764:Int32,blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (__iruid_2764 (GetField __iruid_2764 (ArrayRef -1 (Ref __iruid_2784) (I32 0)))) (blockRowIdx (GetField blockRowIdx (Ref __iruid_2779))) (blockColIdx (GetField blockColIdx (Ref __iruid_2779))) (ndBlock (MakeNDArray -1 (ToArray (StreamFlatMap __iruid_2785 (ToStream False (Ref __iruid_2784)) (ToStream False (GetField rowOfData (Ref __iruid_2785))))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_2784))) (Cast Int64 (GetField blockSize (Ref __iruid_2779)))) (True))))))) (Let __iruid_2788 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_2787)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_2787)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_2788) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_2787)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_2788))) (Str "-")) (UUID4 __iruid_2789)))))))) (NA String)) (Ref __iruid_2790)))) (ToStream False (Ref __iruid_2791))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_2792)) (GetTupleElement 0 (Ref __iruid_2793))))) (GetTupleElement 1 (Ref __iruid_2794)))))) 2023-04-22 21:14:20.964 : INFO: Prune: MakeStruct: eliminating field '__iruid_2764' 2023-04-22 21:14:21.003 : INFO: Prune: MakeStruct: eliminating field '__iruid_2668' 2023-04-22 21:14:21.050 : INFO: after optimize: relationalLowerer, initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata 
"{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_2889 (ToStream False (ArraySort __iruid_2890 __iruid_2891 (StreamFlatMap __iruid_2892 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_2893 __iruid_2894 (StreamFlatMap __iruid_2895 (StreamZip -1 AssertSameLength (__iruid_2896 __iruid_2897) (Let __iruid_2898 (ToArray (StreamZip -1 AssertSameLength (__iruid_2899 __iruid_2900 __iruid_2901) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_2899)) (mwStartIdx (Cast Int32 (Ref __iruid_2900))) (mwStopIdx (Cast Int32 (Ref __iruid_2901)))))) (StreamMap __iruid_2902 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_2902))) (oldContexts (ToArray (StreamMap __iruid_2903 (ToStream False (GetField parentPartitions (Ref __iruid_2902))) (ArrayRef -1 (Ref __iruid_2898) (Ref __iruid_2903)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_2904 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_2896)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_2904) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_2904) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_2904)) (blockRowIdx (Ref __iruid_2897))))) (Ref __iruid_2895)) (Literal Struct{} ) (ToArray (StreamMap __iruid_2905 (Let __iruid_2906 (GetField blockStart (Ref __iruid_2893)) (Let __iruid_2907 (ToArray (StreamMap __iruid_2908 (Let __iruid_2909 (GetField oldTableCtx (Ref __iruid_2893)) (Let __iruid_2910 (GetField partitionBound (Ref __iruid_2909)) (StreamTakeWhile __iruid_2911 (StreamDropWhile __iruid_2912 (StreamFlatMap __iruid_2913 (ToStream True (GetField oldContexts (Ref __iruid_2909))) (StreamZip -1 AssertSameLength (__iruid_2914 __iruid_2915) (StreamMap __iruid_2916 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_2913))) (Let __iruid_2917 (ToArray (StreamMap __iruid_2918 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_2916))) (Let __iruid_2919 (InsertFields (SelectFields () (Ref __iruid_2918)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2918))))) (If (IsNA (Ref __iruid_2919)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2919))))) (Let __iruid_2920 (StreamAgg __iruid_2921 (StreamFilter __iruid_2922 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2917)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2917) (Ref __iruid_2922))))) (InsertFields (SelectFields () (Ref __iruid_2916)) None (__mean_gt (AggLet __iruid_2923 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2917) (Ref __iruid_2921)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2923))) (Let __iruid_2924 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2923)))))) (Cast Float64 (Ref __iruid_2924)))))))) (InsertFields (Ref __iruid_2920) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2925 (ToStream False (Ref __iruid_2917)) (InsertFields (SelectFields () (Ref __iruid_2925)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2925))) (GetField __mean_gt (Ref __iruid_2920)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_2913)) (GetField mwStopIdx (Ref __iruid_2913)) (I32 1)) (InsertFields (Ref __iruid_2914) None (__iruid_2764 (Ref __iruid_2915))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2912)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2910)) (Apply -1 includesStart () Boolean (Ref __iruid_2910)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2911)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2910)) (Apply -1 includesEnd () Boolean (Ref __iruid_2910)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_2908))) (rowOfData (ToArray (StreamMap __iruid_2926 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_2908)) (Ref __iruid_2906) (ApplyBinaryPrimOp Add (Ref __iruid_2906) (GetField blockSize (Ref __iruid_2893))) (I32 1))) (GetField __uid_6 (Ref __iruid_2926)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_2893))) (blockColIdx (GetField blockColIdx (Ref __iruid_2893))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_2927 (ToStream False (Ref __iruid_2907)) (ToStream False (GetField rowOfData (Ref __iruid_2927)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_2907))) (Cast Int64 (GetField blockSize (Ref __iruid_2893)))) (True))))))) (Let __iruid_2928 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_2905)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_2905)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_2928) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_2905)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_2928))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_2892))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_2890)) (GetTupleElement 0 (Ref __iruid_2891))))) (GetTupleElement 1 (Ref __iruid_2889)))))) 2023-04-22 21:14:21.072 : INFO: after LowerMatrixToTable: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_2889 (ToStream False (ArraySort __iruid_2890 __iruid_2891 (StreamFlatMap __iruid_2892 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_2893 __iruid_2894 (StreamFlatMap __iruid_2895 (StreamZip -1 AssertSameLength (__iruid_2896 __iruid_2897) (Let __iruid_2898 (ToArray (StreamZip -1 AssertSameLength (__iruid_2899 __iruid_2900 __iruid_2901) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_2899)) (mwStartIdx (Cast Int32 (Ref __iruid_2900))) (mwStopIdx (Cast Int32 (Ref __iruid_2901)))))) (StreamMap __iruid_2902 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_2902))) (oldContexts (ToArray (StreamMap __iruid_2903 (ToStream False (GetField parentPartitions (Ref __iruid_2902))) (ArrayRef -1 (Ref __iruid_2898) (Ref __iruid_2903)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_2904 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_2896)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_2904) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_2904) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_2904)) (blockRowIdx (Ref __iruid_2897))))) (Ref __iruid_2895)) (Literal Struct{} ) (ToArray (StreamMap __iruid_2905 (Let __iruid_2906 (GetField blockStart (Ref __iruid_2893)) (Let __iruid_2907 (ToArray (StreamMap __iruid_2908 (Let __iruid_2909 (GetField oldTableCtx (Ref __iruid_2893)) (Let __iruid_2910 (GetField partitionBound (Ref __iruid_2909)) (StreamTakeWhile __iruid_2911 (StreamDropWhile __iruid_2912 (StreamFlatMap __iruid_2913 (ToStream True (GetField oldContexts (Ref __iruid_2909))) (StreamZip -1 AssertSameLength (__iruid_2914 __iruid_2915) (StreamMap __iruid_2916 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_2913))) (Let __iruid_2917 (ToArray (StreamMap __iruid_2918 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_2916))) (Let __iruid_2919 (InsertFields (SelectFields () (Ref __iruid_2918)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_2918))))) (If (IsNA (Ref __iruid_2919)) (Literal Struct{__gt:Int32} ) (Ref __iruid_2919))))) (Let __iruid_2920 (StreamAgg __iruid_2921 (StreamFilter __iruid_2922 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_2917)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_2917) (Ref __iruid_2922))))) (InsertFields (SelectFields () (Ref __iruid_2916)) None (__mean_gt (AggLet __iruid_2923 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_2917) (Ref __iruid_2921)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_2923))) (Let __iruid_2924 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_2923)))))) (Cast Float64 (Ref __iruid_2924)))))))) (InsertFields (Ref __iruid_2920) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_2925 (ToStream False (Ref __iruid_2917)) (InsertFields (SelectFields () (Ref __iruid_2925)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_2925))) (GetField __mean_gt (Ref __iruid_2920)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_2913)) (GetField mwStopIdx (Ref __iruid_2913)) (I32 1)) (InsertFields (Ref __iruid_2914) None (__iruid_2764 (Ref __iruid_2915))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2912)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2910)) (Apply -1 includesStart () Boolean (Ref __iruid_2910)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_2911)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_2910)) (Apply -1 includesEnd () Boolean (Ref __iruid_2910)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_2908))) (rowOfData (ToArray (StreamMap __iruid_2926 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_2908)) (Ref __iruid_2906) (ApplyBinaryPrimOp Add (Ref __iruid_2906) (GetField blockSize (Ref __iruid_2893))) (I32 1))) (GetField __uid_6 (Ref __iruid_2926)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_2893))) (blockColIdx (GetField blockColIdx (Ref __iruid_2893))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_2927 (ToStream False (Ref __iruid_2907)) (ToStream False (GetField rowOfData (Ref __iruid_2927)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_2907))) (Cast Int64 (GetField blockSize (Ref __iruid_2893)))) (True))))))) (Let __iruid_2928 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_2905)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_2905)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_2928) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_2905)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_2928))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_2892))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_2890)) (GetTupleElement 0 (Ref __iruid_2891))))) (GetTupleElement 1 (Ref __iruid_2889)))))) 2023-04-22 21:14:21.171 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3009 (ToStream False (ArraySort __iruid_3010 __iruid_3011 (StreamFlatMap __iruid_3012 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3013 __iruid_3014 (StreamFlatMap __iruid_3015 (StreamZip -1 AssertSameLength (__iruid_3016 __iruid_3017) (Let __iruid_3018 (ToArray (StreamZip -1 AssertSameLength (__iruid_3019 __iruid_3020 __iruid_3021) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3019)) (mwStartIdx (Cast Int32 (Ref __iruid_3020))) (mwStopIdx (Cast Int32 (Ref __iruid_3021)))))) (StreamMap __iruid_3022 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3022))) (oldContexts (ToArray (StreamMap __iruid_3023 (ToStream False (GetField parentPartitions (Ref __iruid_3022))) (ArrayRef -1 (Ref __iruid_3018) (Ref __iruid_3023)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3024 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3016)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3024) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3024) 
(I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3024)) (blockRowIdx (Ref __iruid_3017))))) (Ref __iruid_3015)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3025 (Let __iruid_3026 (GetField blockStart (Ref __iruid_3013)) (Let __iruid_3027 (ToArray (StreamMap __iruid_3028 (Let __iruid_3029 (GetField oldTableCtx (Ref __iruid_3013)) (Let __iruid_3030 (GetField partitionBound (Ref __iruid_3029)) (StreamTakeWhile __iruid_3031 (StreamDropWhile __iruid_3032 (StreamFlatMap __iruid_3033 (ToStream True (GetField oldContexts (Ref __iruid_3029))) (StreamZip -1 AssertSameLength (__iruid_3034 __iruid_3035) (StreamMap __iruid_3036 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3033))) (Let __iruid_3037 (ToArray (StreamMap __iruid_3038 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3036))) (Let __iruid_3039 (InsertFields (SelectFields () (Ref __iruid_3038)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3038))))) (If (IsNA (Ref __iruid_3039)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3039))))) (Let __iruid_3040 (StreamAgg __iruid_3041 (StreamFilter __iruid_3042 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3037)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3042))))) (InsertFields (SelectFields () (Ref __iruid_3036)) None (__mean_gt (AggLet __iruid_3043 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3041)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3043))) (Let __iruid_3044 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3043)))))) (Cast Float64 (Ref __iruid_3044)))))))) (InsertFields (Ref __iruid_3040) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3045 (ToStream False (Ref __iruid_3037)) (InsertFields (SelectFields () (Ref __iruid_3045)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3045))) (GetField __mean_gt (Ref __iruid_3040)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3033)) (GetField mwStopIdx (Ref __iruid_3033)) (I32 1)) (InsertFields (Ref __iruid_3034) None (__iruid_2764 (Ref __iruid_3035))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3032)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesStart () Boolean (Ref __iruid_3030)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3031)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesEnd () Boolean (Ref __iruid_3030)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3028))) (rowOfData (ToArray (StreamMap __iruid_3046 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3028)) (Ref __iruid_3026) (ApplyBinaryPrimOp Add (Ref __iruid_3026) (GetField blockSize (Ref __iruid_3013))) (I32 1))) (GetField __uid_6 (Ref __iruid_3046)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3013))) (blockColIdx (GetField blockColIdx (Ref __iruid_3013))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3047 (ToStream False (Ref __iruid_3027)) (ToStream False (GetField rowOfData (Ref __iruid_3047)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3027))) (Cast Int64 (GetField blockSize (Ref __iruid_3013)))) (True))))))) (Let __iruid_3048 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3025)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3025)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3048) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3025)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3048))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3012))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3010)) (GetTupleElement 0 (Ref __iruid_3011))))) (GetTupleElement 1 (Ref __iruid_3009)))))) 2023-04-22 21:14:21.178 : INFO: after LiftRelationalValuesToRelationalLets: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3009 (ToStream False (ArraySort __iruid_3010 __iruid_3011 (StreamFlatMap __iruid_3012 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3013 __iruid_3014 (StreamFlatMap __iruid_3015 (StreamZip -1 AssertSameLength (__iruid_3016 __iruid_3017) (Let __iruid_3018 (ToArray (StreamZip -1 AssertSameLength (__iruid_3019 __iruid_3020 __iruid_3021) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3019)) (mwStartIdx (Cast Int32 (Ref __iruid_3020))) (mwStopIdx (Cast Int32 (Ref __iruid_3021)))))) (StreamMap __iruid_3022 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3022))) (oldContexts (ToArray (StreamMap __iruid_3023 (ToStream False (GetField parentPartitions (Ref __iruid_3022))) (ArrayRef -1 (Ref __iruid_3018) (Ref __iruid_3023)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3024 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3016)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3024) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3024) (I32 1)) (I32 55) 
(I32 4096))) (blockColIdx (Ref __iruid_3024)) (blockRowIdx (Ref __iruid_3017))))) (Ref __iruid_3015)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3025 (Let __iruid_3026 (GetField blockStart (Ref __iruid_3013)) (Let __iruid_3027 (ToArray (StreamMap __iruid_3028 (Let __iruid_3029 (GetField oldTableCtx (Ref __iruid_3013)) (Let __iruid_3030 (GetField partitionBound (Ref __iruid_3029)) (StreamTakeWhile __iruid_3031 (StreamDropWhile __iruid_3032 (StreamFlatMap __iruid_3033 (ToStream True (GetField oldContexts (Ref __iruid_3029))) (StreamZip -1 AssertSameLength (__iruid_3034 __iruid_3035) (StreamMap __iruid_3036 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3033))) (Let __iruid_3037 (ToArray (StreamMap __iruid_3038 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3036))) (Let __iruid_3039 (InsertFields (SelectFields () (Ref __iruid_3038)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3038))))) (If (IsNA (Ref __iruid_3039)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3039))))) (Let __iruid_3040 (StreamAgg __iruid_3041 (StreamFilter __iruid_3042 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3037)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3042))))) (InsertFields (SelectFields () (Ref __iruid_3036)) None (__mean_gt (AggLet __iruid_3043 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3041)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3043))) (Let __iruid_3044 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3043)))))) (Cast Float64 (Ref __iruid_3044)))))))) (InsertFields (Ref __iruid_3040) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3045 (ToStream False (Ref __iruid_3037)) (InsertFields (SelectFields () (Ref __iruid_3045)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3045))) (GetField __mean_gt (Ref __iruid_3040)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3033)) (GetField mwStopIdx (Ref __iruid_3033)) (I32 1)) (InsertFields (Ref __iruid_3034) None (__iruid_2764 (Ref __iruid_3035))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3032)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesStart () Boolean (Ref __iruid_3030)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3031)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesEnd () Boolean (Ref __iruid_3030)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3028))) (rowOfData (ToArray (StreamMap __iruid_3046 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3028)) (Ref __iruid_3026) (ApplyBinaryPrimOp Add (Ref __iruid_3026) (GetField blockSize (Ref __iruid_3013))) (I32 1))) (GetField __uid_6 (Ref __iruid_3046)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3013))) (blockColIdx (GetField blockColIdx (Ref __iruid_3013))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3047 (ToStream False (Ref __iruid_3027)) (ToStream False (GetField rowOfData (Ref __iruid_3047)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3027))) (Cast Int64 (GetField blockSize (Ref __iruid_3013)))) (True))))))) (Let __iruid_3048 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3025)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3025)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3048) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3025)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3048))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3012))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3010)) (GetTupleElement 0 (Ref __iruid_3011))))) (GetTupleElement 1 (Ref __iruid_3009)))))) 2023-04-22 21:14:21.197 : INFO: after EvalRelationalLets: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3009 (ToStream False (ArraySort __iruid_3010 __iruid_3011 (StreamFlatMap __iruid_3012 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3013 __iruid_3014 (StreamFlatMap __iruid_3015 (StreamZip -1 AssertSameLength (__iruid_3016 __iruid_3017) (Let __iruid_3018 (ToArray (StreamZip -1 AssertSameLength (__iruid_3019 __iruid_3020 __iruid_3021) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3019)) (mwStartIdx (Cast Int32 (Ref __iruid_3020))) (mwStopIdx (Cast Int32 (Ref __iruid_3021)))))) (StreamMap __iruid_3022 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3022))) (oldContexts (ToArray (StreamMap __iruid_3023 (ToStream False (GetField parentPartitions (Ref __iruid_3022))) (ArrayRef -1 (Ref __iruid_3018) (Ref __iruid_3023)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3024 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3016)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3024) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3024) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_3024)) (blockRowIdx (Ref __iruid_3017))))) (Ref __iruid_3015)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3025 (Let __iruid_3026 (GetField blockStart (Ref __iruid_3013)) (Let __iruid_3027 (ToArray (StreamMap __iruid_3028 (Let __iruid_3029 (GetField oldTableCtx (Ref __iruid_3013)) (Let __iruid_3030 (GetField partitionBound (Ref __iruid_3029)) (StreamTakeWhile __iruid_3031 (StreamDropWhile __iruid_3032 (StreamFlatMap __iruid_3033 (ToStream True (GetField oldContexts (Ref __iruid_3029))) (StreamZip -1 AssertSameLength (__iruid_3034 __iruid_3035) (StreamMap __iruid_3036 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3033))) (Let __iruid_3037 (ToArray (StreamMap __iruid_3038 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3036))) (Let __iruid_3039 (InsertFields (SelectFields () (Ref __iruid_3038)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3038))))) (If (IsNA (Ref __iruid_3039)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3039))))) (Let __iruid_3040 (StreamAgg __iruid_3041 (StreamFilter __iruid_3042 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3037)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3042))))) (InsertFields (SelectFields () (Ref __iruid_3036)) None (__mean_gt (AggLet __iruid_3043 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3041)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3043))) (Let __iruid_3044 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3043)))))) (Cast Float64 (Ref __iruid_3044)))))))) (InsertFields (Ref __iruid_3040) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3045 (ToStream False (Ref __iruid_3037)) (InsertFields (SelectFields () (Ref __iruid_3045)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3045))) (GetField __mean_gt (Ref __iruid_3040)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3033)) (GetField mwStopIdx (Ref __iruid_3033)) (I32 1)) (InsertFields (Ref __iruid_3034) None (__iruid_2764 (Ref __iruid_3035))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3032)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesStart () Boolean (Ref __iruid_3030)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3031)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesEnd () Boolean (Ref __iruid_3030)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3028))) (rowOfData (ToArray (StreamMap __iruid_3046 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3028)) (Ref __iruid_3026) (ApplyBinaryPrimOp Add (Ref __iruid_3026) (GetField blockSize (Ref __iruid_3013))) (I32 1))) (GetField __uid_6 (Ref __iruid_3046)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3013))) (blockColIdx (GetField blockColIdx (Ref __iruid_3013))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3047 (ToStream False (Ref __iruid_3027)) (ToStream False (GetField rowOfData (Ref __iruid_3047)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3027))) (Cast Int64 (GetField blockSize (Ref __iruid_3013)))) (True))))))) (Let __iruid_3048 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3025)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3025)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3048) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3025)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3048))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3012))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3010)) (GetTupleElement 0 (Ref __iruid_3011))))) (GetTupleElement 1 (Ref __iruid_3009)))))) 2023-04-22 21:14:21.212 : INFO: after LowerAndExecuteShuffles: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3009 (ToStream False (ArraySort __iruid_3010 __iruid_3011 (StreamFlatMap __iruid_3012 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3013 __iruid_3014 (StreamFlatMap __iruid_3015 (StreamZip -1 AssertSameLength (__iruid_3016 __iruid_3017) (Let __iruid_3018 (ToArray (StreamZip -1 AssertSameLength (__iruid_3019 __iruid_3020 __iruid_3021) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3019)) (mwStartIdx (Cast Int32 (Ref __iruid_3020))) (mwStopIdx (Cast Int32 (Ref __iruid_3021)))))) (StreamMap __iruid_3022 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3022))) (oldContexts (ToArray (StreamMap __iruid_3023 (ToStream False (GetField parentPartitions (Ref __iruid_3022))) (ArrayRef -1 (Ref __iruid_3018) (Ref __iruid_3023)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3024 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3016)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3024) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3024) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_3024)) (blockRowIdx (Ref __iruid_3017))))) (Ref __iruid_3015)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3025 (Let __iruid_3026 (GetField blockStart (Ref __iruid_3013)) (Let __iruid_3027 (ToArray (StreamMap __iruid_3028 (Let __iruid_3029 (GetField oldTableCtx (Ref __iruid_3013)) (Let __iruid_3030 (GetField partitionBound (Ref __iruid_3029)) (StreamTakeWhile __iruid_3031 (StreamDropWhile __iruid_3032 (StreamFlatMap __iruid_3033 (ToStream True (GetField oldContexts (Ref __iruid_3029))) (StreamZip -1 AssertSameLength (__iruid_3034 __iruid_3035) (StreamMap __iruid_3036 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3033))) (Let __iruid_3037 (ToArray (StreamMap __iruid_3038 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3036))) (Let __iruid_3039 (InsertFields (SelectFields () (Ref __iruid_3038)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3038))))) (If (IsNA (Ref __iruid_3039)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3039))))) (Let __iruid_3040 (StreamAgg __iruid_3041 (StreamFilter __iruid_3042 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3037)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3042))))) (InsertFields (SelectFields () (Ref __iruid_3036)) None (__mean_gt (AggLet __iruid_3043 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3037) (Ref __iruid_3041)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3043))) (Let __iruid_3044 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3043)))))) (Cast Float64 (Ref __iruid_3044)))))))) (InsertFields (Ref __iruid_3040) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3045 (ToStream False (Ref __iruid_3037)) (InsertFields (SelectFields () (Ref __iruid_3045)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3045))) (GetField __mean_gt (Ref __iruid_3040)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3033)) (GetField mwStopIdx (Ref __iruid_3033)) (I32 1)) (InsertFields (Ref __iruid_3034) None (__iruid_2764 (Ref __iruid_3035))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3032)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesStart () Boolean (Ref __iruid_3030)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3031)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3030)) (Apply -1 includesEnd () Boolean (Ref __iruid_3030)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3028))) (rowOfData (ToArray (StreamMap __iruid_3046 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3028)) (Ref __iruid_3026) (ApplyBinaryPrimOp Add (Ref __iruid_3026) (GetField blockSize (Ref __iruid_3013))) (I32 1))) (GetField __uid_6 (Ref __iruid_3046)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3013))) (blockColIdx (GetField blockColIdx (Ref __iruid_3013))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3047 (ToStream False (Ref __iruid_3027)) (ToStream False (GetField rowOfData (Ref __iruid_3047)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3027))) (Cast Int64 (GetField blockSize (Ref __iruid_3013)))) (True))))))) (Let __iruid_3048 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3025)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3025)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3048) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3025)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3048))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3012))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3010)) (GetTupleElement 0 (Ref __iruid_3011))))) (GetTupleElement 1 (Ref __iruid_3009)))))) 2023-04-22 21:14:21.333 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3129 (ToStream False (ArraySort __iruid_3130 __iruid_3131 (StreamFlatMap __iruid_3132 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3133 __iruid_3134 (StreamFlatMap __iruid_3135 (StreamZip -1 AssertSameLength (__iruid_3136 __iruid_3137) (Let __iruid_3138 (ToArray (StreamZip -1 AssertSameLength (__iruid_3139 __iruid_3140 __iruid_3141) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3139)) (mwStartIdx (Cast Int32 (Ref __iruid_3140))) (mwStopIdx (Cast Int32 (Ref __iruid_3141)))))) (StreamMap __iruid_3142 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3142))) (oldContexts (ToArray (StreamMap __iruid_3143 (ToStream False (GetField parentPartitions (Ref __iruid_3142))) (ArrayRef -1 (Ref __iruid_3138) (Ref __iruid_3143)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3144 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3136)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3144) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref 
__iruid_3144) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3144)) (blockRowIdx (Ref __iruid_3137))))) (Ref __iruid_3135)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3145 (Let __iruid_3146 (GetField blockStart (Ref __iruid_3133)) (Let __iruid_3147 (ToArray (StreamMap __iruid_3148 (Let __iruid_3149 (GetField oldTableCtx (Ref __iruid_3133)) (Let __iruid_3150 (GetField partitionBound (Ref __iruid_3149)) (StreamTakeWhile __iruid_3151 (StreamDropWhile __iruid_3152 (StreamFlatMap __iruid_3153 (ToStream True (GetField oldContexts (Ref __iruid_3149))) (StreamZip -1 AssertSameLength (__iruid_3154 __iruid_3155) (StreamMap __iruid_3156 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3153))) (Let __iruid_3157 (ToArray (StreamMap __iruid_3158 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3156))) (Let __iruid_3159 (InsertFields (SelectFields () (Ref __iruid_3158)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3158))))) (If (IsNA (Ref __iruid_3159)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3159))))) (Let __iruid_3160 (StreamAgg __iruid_3161 (StreamFilter __iruid_3162 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3157)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3157) (Ref __iruid_3162))))) (InsertFields (SelectFields () (Ref __iruid_3156)) None (__mean_gt (AggLet __iruid_3163 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3157) (Ref __iruid_3161)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3163))) (Let __iruid_3164 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3163)))))) (Cast Float64 (Ref __iruid_3164)))))))) (InsertFields (Ref __iruid_3160) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3165 (ToStream False (Ref __iruid_3157)) (InsertFields (SelectFields () (Ref __iruid_3165)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3165))) (GetField __mean_gt (Ref __iruid_3160)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3153)) (GetField mwStopIdx (Ref __iruid_3153)) (I32 1)) (InsertFields (Ref __iruid_3154) None (__iruid_2764 (Ref __iruid_3155))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3152)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3150)) (Apply -1 includesStart () Boolean (Ref __iruid_3150)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3151)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3150)) (Apply -1 includesEnd () Boolean (Ref __iruid_3150)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3148))) (rowOfData (ToArray (StreamMap __iruid_3166 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3148)) (Ref __iruid_3146) (ApplyBinaryPrimOp Add (Ref __iruid_3146) (GetField blockSize (Ref __iruid_3133))) (I32 1))) (GetField __uid_6 (Ref __iruid_3166)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3133))) (blockColIdx (GetField blockColIdx (Ref __iruid_3133))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3167 (ToStream False (Ref __iruid_3147)) (ToStream False (GetField rowOfData (Ref __iruid_3167)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3147))) (Cast Int64 (GetField blockSize (Ref __iruid_3133)))) (True))))))) (Let __iruid_3168 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3145)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3145)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3168) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3145)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3168))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3132))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3130)) (GetTupleElement 0 (Ref __iruid_3131))))) (GetTupleElement 1 (Ref __iruid_3129)))))) 2023-04-22 21:14:21.362 : INFO: after LowerOrInterpretNonCompilable: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3129 (ToStream False (ArraySort __iruid_3130 __iruid_3131 (StreamFlatMap __iruid_3132 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3133 __iruid_3134 (StreamFlatMap __iruid_3135 (StreamZip -1 AssertSameLength (__iruid_3136 __iruid_3137) (Let __iruid_3138 (ToArray (StreamZip -1 AssertSameLength (__iruid_3139 __iruid_3140 __iruid_3141) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3139)) (mwStartIdx (Cast Int32 (Ref __iruid_3140))) (mwStopIdx (Cast Int32 (Ref __iruid_3141)))))) (StreamMap __iruid_3142 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3142))) (oldContexts (ToArray (StreamMap __iruid_3143 (ToStream False (GetField parentPartitions (Ref __iruid_3142))) (ArrayRef -1 (Ref __iruid_3138) (Ref __iruid_3143)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3144 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3136)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3144) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3144) (I32 1)) (I32 55) (I32 
4096))) (blockColIdx (Ref __iruid_3144)) (blockRowIdx (Ref __iruid_3137))))) (Ref __iruid_3135)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3145 (Let __iruid_3146 (GetField blockStart (Ref __iruid_3133)) (Let __iruid_3147 (ToArray (StreamMap __iruid_3148 (Let __iruid_3149 (GetField oldTableCtx (Ref __iruid_3133)) (Let __iruid_3150 (GetField partitionBound (Ref __iruid_3149)) (StreamTakeWhile __iruid_3151 (StreamDropWhile __iruid_3152 (StreamFlatMap __iruid_3153 (ToStream True (GetField oldContexts (Ref __iruid_3149))) (StreamZip -1 AssertSameLength (__iruid_3154 __iruid_3155) (StreamMap __iruid_3156 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3153))) (Let __iruid_3157 (ToArray (StreamMap __iruid_3158 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3156))) (Let __iruid_3159 (InsertFields (SelectFields () (Ref __iruid_3158)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3158))))) (If (IsNA (Ref __iruid_3159)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3159))))) (Let __iruid_3160 (StreamAgg __iruid_3161 (StreamFilter __iruid_3162 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3157)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3157) (Ref __iruid_3162))))) (InsertFields (SelectFields () (Ref __iruid_3156)) None (__mean_gt (AggLet __iruid_3163 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3157) (Ref __iruid_3161)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3163))) (Let __iruid_3164 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3163)))))) (Cast Float64 (Ref __iruid_3164)))))))) (InsertFields (Ref __iruid_3160) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3165 (ToStream False (Ref __iruid_3157)) (InsertFields (SelectFields () (Ref __iruid_3165)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3165))) (GetField __mean_gt (Ref __iruid_3160)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3153)) (GetField mwStopIdx (Ref __iruid_3153)) (I32 1)) (InsertFields (Ref __iruid_3154) None (__iruid_2764 (Ref __iruid_3155))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3152)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3150)) (Apply -1 includesStart () Boolean (Ref __iruid_3150)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3151)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3150)) (Apply -1 includesEnd () Boolean (Ref __iruid_3150)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3148))) (rowOfData (ToArray (StreamMap __iruid_3166 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3148)) (Ref __iruid_3146) (ApplyBinaryPrimOp Add (Ref __iruid_3146) (GetField blockSize (Ref __iruid_3133))) (I32 1))) (GetField __uid_6 (Ref __iruid_3166)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3133))) (blockColIdx (GetField blockColIdx (Ref __iruid_3133))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3167 (ToStream False (Ref __iruid_3147)) (ToStream False (GetField rowOfData (Ref __iruid_3167)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3147))) (Cast Int64 (GetField blockSize (Ref __iruid_3133)))) (True))))))) (Let __iruid_3168 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3145)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3145)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3168) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3145)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3168))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3132))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3130)) (GetTupleElement 0 (Ref __iruid_3131))))) (GetTupleElement 1 (Ref __iruid_3129)))))) 2023-04-22 21:14:21.459 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3249 (ToStream False (ArraySort __iruid_3250 __iruid_3251 (StreamFlatMap __iruid_3252 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3253 __iruid_3254 (StreamFlatMap __iruid_3255 (StreamZip -1 AssertSameLength (__iruid_3256 __iruid_3257) (Let __iruid_3258 (ToArray (StreamZip -1 AssertSameLength (__iruid_3259 __iruid_3260 __iruid_3261) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3259)) (mwStartIdx (Cast Int32 (Ref __iruid_3260))) (mwStopIdx (Cast Int32 (Ref __iruid_3261)))))) (StreamMap __iruid_3262 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3262))) (oldContexts (ToArray (StreamMap __iruid_3263 (ToStream False (GetField parentPartitions (Ref __iruid_3262))) (ArrayRef -1 (Ref __iruid_3258) (Ref __iruid_3263)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3264 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3256)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3264) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref 
__iruid_3264) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3264)) (blockRowIdx (Ref __iruid_3257))))) (Ref __iruid_3255)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3265 (Let __iruid_3266 (GetField blockStart (Ref __iruid_3253)) (Let __iruid_3267 (ToArray (StreamMap __iruid_3268 (Let __iruid_3269 (GetField oldTableCtx (Ref __iruid_3253)) (Let __iruid_3270 (GetField partitionBound (Ref __iruid_3269)) (StreamTakeWhile __iruid_3271 (StreamDropWhile __iruid_3272 (StreamFlatMap __iruid_3273 (ToStream True (GetField oldContexts (Ref __iruid_3269))) (StreamZip -1 AssertSameLength (__iruid_3274 __iruid_3275) (StreamMap __iruid_3276 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3273))) (Let __iruid_3277 (ToArray (StreamMap __iruid_3278 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3276))) (Let __iruid_3279 (InsertFields (SelectFields () (Ref __iruid_3278)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3278))))) (If (IsNA (Ref __iruid_3279)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3279))))) (Let __iruid_3280 (StreamAgg __iruid_3281 (StreamFilter __iruid_3282 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3277)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3282))))) (InsertFields (SelectFields () (Ref __iruid_3276)) None (__mean_gt (AggLet __iruid_3283 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3281)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3283))) (Let __iruid_3284 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3283)))))) (Cast Float64 (Ref __iruid_3284)))))))) (InsertFields (Ref __iruid_3280) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3285 (ToStream False (Ref __iruid_3277)) (InsertFields (SelectFields () (Ref __iruid_3285)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3285))) (GetField __mean_gt (Ref __iruid_3280)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3273)) (GetField mwStopIdx (Ref __iruid_3273)) (I32 1)) (InsertFields (Ref __iruid_3274) None (__iruid_2764 (Ref __iruid_3275))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3272)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesStart () Boolean (Ref __iruid_3270)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3271)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesEnd () Boolean (Ref __iruid_3270)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3268))) (rowOfData (ToArray (StreamMap __iruid_3286 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3268)) (Ref __iruid_3266) (ApplyBinaryPrimOp Add (Ref __iruid_3266) (GetField blockSize (Ref __iruid_3253))) (I32 1))) (GetField __uid_6 (Ref __iruid_3286)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3253))) (blockColIdx (GetField blockColIdx (Ref __iruid_3253))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3287 (ToStream False (Ref __iruid_3267)) (ToStream False (GetField rowOfData (Ref __iruid_3287)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3267))) (Cast Int64 (GetField blockSize (Ref __iruid_3253)))) (True))))))) (Let __iruid_3288 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3265)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3265)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3288) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3265)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3288))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3252))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3250)) (GetTupleElement 0 (Ref __iruid_3251))))) (GetTupleElement 1 (Ref __iruid_3249)))))) 2023-04-22 21:14:21.480 : INFO: initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3249 (ToStream False (ArraySort __iruid_3250 __iruid_3251 (StreamFlatMap __iruid_3252 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3253 __iruid_3254 (StreamFlatMap __iruid_3255 (StreamZip -1 AssertSameLength (__iruid_3256 __iruid_3257) (Let __iruid_3258 (ToArray (StreamZip -1 AssertSameLength (__iruid_3259 __iruid_3260 __iruid_3261) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3259)) (mwStartIdx (Cast Int32 (Ref __iruid_3260))) (mwStopIdx (Cast Int32 (Ref __iruid_3261)))))) (StreamMap __iruid_3262 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3262))) (oldContexts (ToArray (StreamMap __iruid_3263 (ToStream False (GetField parentPartitions (Ref __iruid_3262))) (ArrayRef -1 (Ref __iruid_3258) (Ref __iruid_3263)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3264 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3256)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3264) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3264) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref 
__iruid_3264)) (blockRowIdx (Ref __iruid_3257))))) (Ref __iruid_3255)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3265 (Let __iruid_3266 (GetField blockStart (Ref __iruid_3253)) (Let __iruid_3267 (ToArray (StreamMap __iruid_3268 (Let __iruid_3269 (GetField oldTableCtx (Ref __iruid_3253)) (Let __iruid_3270 (GetField partitionBound (Ref __iruid_3269)) (StreamTakeWhile __iruid_3271 (StreamDropWhile __iruid_3272 (StreamFlatMap __iruid_3273 (ToStream True (GetField oldContexts (Ref __iruid_3269))) (StreamZip -1 AssertSameLength (__iruid_3274 __iruid_3275) (StreamMap __iruid_3276 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3273))) (Let __iruid_3277 (ToArray (StreamMap __iruid_3278 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3276))) (Let __iruid_3279 (InsertFields (SelectFields () (Ref __iruid_3278)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3278))))) (If (IsNA (Ref __iruid_3279)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3279))))) (Let __iruid_3280 (StreamAgg __iruid_3281 (StreamFilter __iruid_3282 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3277)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3282))))) (InsertFields (SelectFields () (Ref __iruid_3276)) None (__mean_gt (AggLet __iruid_3283 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3281)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3283))) (Let __iruid_3284 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3283)))))) (Cast Float64 (Ref __iruid_3284)))))))) (InsertFields (Ref __iruid_3280) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3285 (ToStream False (Ref __iruid_3277)) (InsertFields (SelectFields () (Ref __iruid_3285)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3285))) (GetField __mean_gt (Ref __iruid_3280)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3273)) (GetField mwStopIdx (Ref __iruid_3273)) (I32 1)) (InsertFields (Ref __iruid_3274) None (__iruid_2764 (Ref __iruid_3275))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3272)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesStart () Boolean (Ref __iruid_3270)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3271)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesEnd () Boolean (Ref __iruid_3270)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3268))) (rowOfData (ToArray (StreamMap __iruid_3286 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3268)) (Ref __iruid_3266) (ApplyBinaryPrimOp Add (Ref __iruid_3266) (GetField blockSize (Ref __iruid_3253))) (I32 1))) (GetField __uid_6 (Ref __iruid_3286)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3253))) (blockColIdx (GetField blockColIdx (Ref __iruid_3253))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3287 (ToStream False (Ref __iruid_3267)) (ToStream False (GetField rowOfData (Ref __iruid_3287)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3267))) (Cast Int64 (GetField blockSize (Ref __iruid_3253)))) (True))))))) (Let __iruid_3288 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3265)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3265)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3288) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3265)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3288))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3252))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3250)) (GetTupleElement 0 (Ref __iruid_3251))))) (GetTupleElement 1 (Ref __iruid_3249)))))) 2023-04-22 21:14:21.569 : INFO: after optimize: compileLowerer, initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3369 (ToStream False (ArraySort __iruid_3370 __iruid_3371 (StreamFlatMap __iruid_3372 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3373 __iruid_3374 (StreamFlatMap __iruid_3375 (StreamZip -1 AssertSameLength (__iruid_3376 __iruid_3377) (Let __iruid_3378 (ToArray (StreamZip -1 AssertSameLength (__iruid_3379 __iruid_3380 __iruid_3381) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3379)) (mwStartIdx (Cast Int32 (Ref __iruid_3380))) (mwStopIdx (Cast Int32 (Ref __iruid_3381)))))) (StreamMap __iruid_3382 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3382))) (oldContexts (ToArray (StreamMap __iruid_3383 (ToStream False (GetField parentPartitions (Ref __iruid_3382))) (ArrayRef -1 (Ref __iruid_3378) (Ref __iruid_3383)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3384 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3376)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3384) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3384) (I32 1)) (I32 55) 
(I32 4096))) (blockColIdx (Ref __iruid_3384)) (blockRowIdx (Ref __iruid_3377))))) (Ref __iruid_3375)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3385 (Let __iruid_3386 (GetField blockStart (Ref __iruid_3373)) (Let __iruid_3387 (ToArray (StreamMap __iruid_3388 (Let __iruid_3389 (GetField oldTableCtx (Ref __iruid_3373)) (Let __iruid_3390 (GetField partitionBound (Ref __iruid_3389)) (StreamTakeWhile __iruid_3391 (StreamDropWhile __iruid_3392 (StreamFlatMap __iruid_3393 (ToStream True (GetField oldContexts (Ref __iruid_3389))) (StreamZip -1 AssertSameLength (__iruid_3394 __iruid_3395) (StreamMap __iruid_3396 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3393))) (Let __iruid_3397 (ToArray (StreamMap __iruid_3398 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3396))) (Let __iruid_3399 (InsertFields (SelectFields () (Ref __iruid_3398)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3398))))) (If (IsNA (Ref __iruid_3399)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3399))))) (Let __iruid_3400 (StreamAgg __iruid_3401 (StreamFilter __iruid_3402 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3397)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3397) (Ref __iruid_3402))))) (InsertFields (SelectFields () (Ref __iruid_3396)) None (__mean_gt (AggLet __iruid_3403 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3397) (Ref __iruid_3401)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3403))) (Let __iruid_3404 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3403)))))) (Cast Float64 (Ref __iruid_3404)))))))) (InsertFields (Ref __iruid_3400) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3405 (ToStream False (Ref __iruid_3397)) (InsertFields (SelectFields () (Ref __iruid_3405)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3405))) (GetField __mean_gt (Ref __iruid_3400)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3393)) (GetField mwStopIdx (Ref __iruid_3393)) (I32 1)) (InsertFields (Ref __iruid_3394) None (__iruid_2764 (Ref __iruid_3395))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3392)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3390)) (Apply -1 includesStart () Boolean (Ref __iruid_3390)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3391)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3390)) (Apply -1 includesEnd () Boolean (Ref __iruid_3390)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3388))) (rowOfData (ToArray (StreamMap __iruid_3406 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3388)) (Ref __iruid_3386) (ApplyBinaryPrimOp Add (Ref __iruid_3386) (GetField blockSize (Ref __iruid_3373))) (I32 1))) (GetField __uid_6 (Ref __iruid_3406)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3373))) (blockColIdx (GetField blockColIdx (Ref __iruid_3373))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3407 (ToStream False (Ref __iruid_3387)) (ToStream False (GetField rowOfData (Ref __iruid_3407)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3387))) (Cast Int64 (GetField blockSize (Ref __iruid_3373)))) (True))))))) (Let __iruid_3408 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3385)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3385)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3408) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3385)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3408))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3372))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3370)) (GetTupleElement 0 (Ref __iruid_3371))))) (GetTupleElement 1 (Ref __iruid_3369)))))) 2023-04-22 21:14:21.600 : INFO: after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3369 (ToStream False (ArraySort __iruid_3370 __iruid_3371 (StreamFlatMap __iruid_3372 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3373 __iruid_3374 (StreamFlatMap __iruid_3375 (StreamZip -1 AssertSameLength (__iruid_3376 __iruid_3377) (Let __iruid_3378 (ToArray (StreamZip -1 AssertSameLength (__iruid_3379 __iruid_3380 __iruid_3381) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3379)) (mwStartIdx (Cast Int32 (Ref __iruid_3380))) (mwStopIdx (Cast Int32 (Ref __iruid_3381)))))) (StreamMap __iruid_3382 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3382))) (oldContexts (ToArray (StreamMap __iruid_3383 (ToStream False (GetField parentPartitions (Ref __iruid_3382))) (ArrayRef -1 (Ref __iruid_3378) (Ref __iruid_3383)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3384 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3376)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3384) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3384) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_3384)) (blockRowIdx (Ref __iruid_3377))))) (Ref __iruid_3375)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3385 (Let __iruid_3386 (GetField blockStart (Ref __iruid_3373)) (Let __iruid_3387 (ToArray (StreamMap __iruid_3388 (Let __iruid_3389 (GetField oldTableCtx (Ref __iruid_3373)) (Let __iruid_3390 (GetField partitionBound (Ref __iruid_3389)) (StreamTakeWhile __iruid_3391 (StreamDropWhile __iruid_3392 (StreamFlatMap __iruid_3393 (ToStream True (GetField oldContexts (Ref __iruid_3389))) (StreamZip -1 AssertSameLength (__iruid_3394 __iruid_3395) (StreamMap __iruid_3396 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3393))) (Let __iruid_3397 (ToArray (StreamMap __iruid_3398 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3396))) (Let __iruid_3399 (InsertFields (SelectFields () (Ref __iruid_3398)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3398))))) (If (IsNA (Ref __iruid_3399)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3399))))) (Let __iruid_3400 (StreamAgg __iruid_3401 (StreamFilter __iruid_3402 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3397)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3397) (Ref __iruid_3402))))) (InsertFields (SelectFields () (Ref __iruid_3396)) None (__mean_gt (AggLet __iruid_3403 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3397) (Ref __iruid_3401)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3403))) (Let __iruid_3404 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3403)))))) (Cast Float64 (Ref __iruid_3404)))))))) (InsertFields (Ref __iruid_3400) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3405 (ToStream False (Ref __iruid_3397)) (InsertFields (SelectFields () (Ref __iruid_3405)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3405))) (GetField __mean_gt (Ref __iruid_3400)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3393)) (GetField mwStopIdx (Ref __iruid_3393)) (I32 1)) (InsertFields (Ref __iruid_3394) None (__iruid_2764 (Ref __iruid_3395))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3392)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3390)) (Apply -1 includesStart () Boolean (Ref __iruid_3390)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3391)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3390)) (Apply -1 includesEnd () Boolean (Ref __iruid_3390)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3388))) (rowOfData (ToArray (StreamMap __iruid_3406 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3388)) (Ref __iruid_3386) (ApplyBinaryPrimOp Add (Ref __iruid_3386) (GetField blockSize (Ref __iruid_3373))) (I32 1))) (GetField __uid_6 (Ref __iruid_3406)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3373))) (blockColIdx (GetField blockColIdx (Ref __iruid_3373))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3407 (ToStream False (Ref __iruid_3387)) (ToStream False (GetField rowOfData (Ref __iruid_3407)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3387))) (Cast Int64 (GetField blockSize (Ref __iruid_3373)))) (True))))))) (Let __iruid_3408 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3385)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3385)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3408) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3385)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3408))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3372))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3370)) (GetTupleElement 0 (Ref __iruid_3371))))) (GetTupleElement 1 (Ref __iruid_3369)))))) 2023-04-22 21:14:21.700 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3489 (ToStream False (ArraySort __iruid_3490 __iruid_3491 (StreamFlatMap __iruid_3492 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3493 __iruid_3494 (StreamFlatMap __iruid_3495 (StreamZip -1 AssertSameLength (__iruid_3496 __iruid_3497) (Let __iruid_3498 (ToArray (StreamZip -1 AssertSameLength (__iruid_3499 __iruid_3500 __iruid_3501) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3499)) (mwStartIdx (Cast Int32 (Ref __iruid_3500))) (mwStopIdx (Cast Int32 (Ref __iruid_3501)))))) (StreamMap __iruid_3502 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3502))) (oldContexts (ToArray (StreamMap __iruid_3503 (ToStream False (GetField parentPartitions (Ref __iruid_3502))) (ArrayRef -1 (Ref __iruid_3498) (Ref __iruid_3503)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3504 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3496)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3504) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3504) (I32 1)) 
(I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3504)) (blockRowIdx (Ref __iruid_3497))))) (Ref __iruid_3495)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3505 (Let __iruid_3506 (GetField blockStart (Ref __iruid_3493)) (Let __iruid_3507 (ToArray (StreamMap __iruid_3508 (Let __iruid_3509 (GetField oldTableCtx (Ref __iruid_3493)) (Let __iruid_3510 (GetField partitionBound (Ref __iruid_3509)) (StreamTakeWhile __iruid_3511 (StreamDropWhile __iruid_3512 (StreamFlatMap __iruid_3513 (ToStream True (GetField oldContexts (Ref __iruid_3509))) (StreamZip -1 AssertSameLength (__iruid_3514 __iruid_3515) (StreamMap __iruid_3516 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3513))) (Let __iruid_3517 (ToArray (StreamMap __iruid_3518 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3516))) (Let __iruid_3519 (InsertFields (SelectFields () (Ref __iruid_3518)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3518))))) (If (IsNA (Ref __iruid_3519)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3519))))) (Let __iruid_3520 (StreamAgg __iruid_3521 (StreamFilter __iruid_3522 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3517)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3517) (Ref __iruid_3522))))) (InsertFields (SelectFields () (Ref __iruid_3516)) None (__mean_gt (AggLet __iruid_3523 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3517) (Ref __iruid_3521)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3523))) (Let __iruid_3524 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3523)))))) (Cast Float64 (Ref __iruid_3524)))))))) (InsertFields (Ref __iruid_3520) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3525 (ToStream False (Ref __iruid_3517)) (InsertFields (SelectFields () (Ref __iruid_3525)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3525))) (GetField __mean_gt (Ref __iruid_3520)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3513)) (GetField mwStopIdx (Ref __iruid_3513)) (I32 1)) (InsertFields (Ref __iruid_3514) None (__iruid_2764 (Ref __iruid_3515))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3512)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3510)) (Apply -1 includesStart () Boolean (Ref __iruid_3510)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3511)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3510)) (Apply -1 includesEnd () Boolean (Ref __iruid_3510)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3508))) (rowOfData (ToArray (StreamMap __iruid_3526 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3508)) (Ref __iruid_3506) (ApplyBinaryPrimOp Add (Ref __iruid_3506) (GetField blockSize (Ref __iruid_3493))) (I32 1))) (GetField __uid_6 (Ref __iruid_3526)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3493))) (blockColIdx (GetField blockColIdx (Ref __iruid_3493))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3527 (ToStream False (Ref __iruid_3507)) (ToStream False (GetField rowOfData (Ref __iruid_3527)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3507))) (Cast Int64 (GetField blockSize (Ref __iruid_3493)))) (True))))))) (Let __iruid_3528 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3505)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3505)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3528) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3505)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3528))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3492))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3490)) (GetTupleElement 0 (Ref __iruid_3491))))) (GetTupleElement 1 (Ref __iruid_3489)))))) 2023-04-22 21:14:21.748 : INFO: after LowerArrayAggsToRunAggs: IR size 261: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3489 (ToStream False (ArraySort __iruid_3490 __iruid_3491 (StreamFlatMap __iruid_3492 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3493 __iruid_3494 (StreamFlatMap __iruid_3495 (StreamZip -1 AssertSameLength (__iruid_3496 __iruid_3497) (Let __iruid_3498 (ToArray (StreamZip -1 AssertSameLength (__iruid_3499 __iruid_3500 __iruid_3501) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3499)) (mwStartIdx (Cast Int32 (Ref __iruid_3500))) (mwStopIdx (Cast Int32 (Ref __iruid_3501)))))) (StreamMap __iruid_3502 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3502))) (oldContexts (ToArray (StreamMap __iruid_3503 (ToStream False (GetField parentPartitions (Ref __iruid_3502))) (ArrayRef -1 (Ref __iruid_3498) (Ref __iruid_3503)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3504 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3496)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3504) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3504) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_3504)) (blockRowIdx (Ref __iruid_3497))))) (Ref __iruid_3495)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3505 (Let __iruid_3506 (GetField blockStart (Ref __iruid_3493)) (Let __iruid_3507 (ToArray (StreamMap __iruid_3508 (Let __iruid_3509 (GetField oldTableCtx (Ref __iruid_3493)) (Let __iruid_3510 (GetField partitionBound (Ref __iruid_3509)) (StreamTakeWhile __iruid_3511 (StreamDropWhile __iruid_3512 (StreamFlatMap __iruid_3513 (ToStream True (GetField oldContexts (Ref __iruid_3509))) (StreamZip -1 AssertSameLength (__iruid_3514 __iruid_3515) (StreamMap __iruid_3516 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3513))) (Let __iruid_3517 (ToArray (StreamMap __iruid_3518 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3516))) (Let __iruid_3519 (InsertFields (SelectFields () (Ref __iruid_3518)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3518))))) (If (IsNA (Ref __iruid_3519)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3519))))) (Let __iruid_3520 (Let __iruid_3529 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_3521 (StreamFilter __iruid_3522 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3517)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3517) (Ref __iruid_3522))))) (Let __iruid_3523 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3517) (Ref __iruid_3521)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_3523))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3523)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_3516)) None (__mean_gt (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_3529)) (Let __iruid_3524 (GetTupleElement 1 (Ref __iruid_3529)) (Cast Float64 (Ref __iruid_3524))))))) (InsertFields (Ref __iruid_3520) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3525 (ToStream False (Ref __iruid_3517)) (InsertFields (SelectFields () (Ref __iruid_3525)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3525))) (GetField __mean_gt (Ref __iruid_3520)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3513)) (GetField mwStopIdx (Ref __iruid_3513)) (I32 1)) (InsertFields (Ref __iruid_3514) None (__iruid_2764 (Ref __iruid_3515))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3512)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3510)) (Apply -1 includesStart () Boolean (Ref __iruid_3510)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3511)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3510)) (Apply -1 includesEnd () Boolean (Ref __iruid_3510)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3508))) (rowOfData (ToArray (StreamMap __iruid_3526 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3508)) (Ref __iruid_3506) (ApplyBinaryPrimOp Add (Ref __iruid_3506) (GetField blockSize (Ref __iruid_3493))) (I32 1))) (GetField __uid_6 (Ref __iruid_3526)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3493))) (blockColIdx (GetField blockColIdx (Ref __iruid_3493))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3527 (ToStream False (Ref __iruid_3507)) (ToStream False (GetField rowOfData (Ref __iruid_3527)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3507))) (Cast Int64 (GetField blockSize (Ref __iruid_3493)))) (True))))))) (Let __iruid_3528 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3505)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3505)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3528) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3505)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3528))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3492))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3490)) (GetTupleElement 0 (Ref __iruid_3491))))) (GetTupleElement 1 (Ref __iruid_3489)))))) 2023-04-22 21:14:21.802 : INFO: Prune: InsertFields: eliminating field '__mean_gt' 2023-04-22 21:14:21.841 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 256: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3615 (ToStream False (ArraySort __iruid_3616 __iruid_3617 (StreamFlatMap __iruid_3618 (ToStream False (CollectDistributedArray 
matrix_block_matrix_writer __iruid_3619 __iruid_3620 (StreamFlatMap __iruid_3621 (StreamZip -1 AssertSameLength (__iruid_3622 __iruid_3623) (Let __iruid_3624 (ToArray (StreamZip -1 AssertSameLength (__iruid_3625 __iruid_3626 __iruid_3627) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3625)) (mwStartIdx (Cast Int32 (Ref __iruid_3626))) (mwStopIdx (Cast Int32 (Ref __iruid_3627)))))) (StreamMap __iruid_3628 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3628))) (oldContexts (ToArray (StreamMap __iruid_3629 (ToStream False (GetField parentPartitions (Ref __iruid_3628))) (ArrayRef -1 (Ref __iruid_3624) (Ref __iruid_3629)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3630 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3622)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3630) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3630) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3630)) (blockRowIdx (Ref __iruid_3623))))) (Ref __iruid_3621)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3631 (Let __iruid_3632 (GetField blockStart (Ref __iruid_3619)) (Let __iruid_3633 (ToArray (StreamMap __iruid_3634 (Let __iruid_3635 (GetField oldTableCtx (Ref __iruid_3619)) (Let __iruid_3636 (GetField partitionBound (Ref __iruid_3635)) (StreamTakeWhile __iruid_3637 (StreamDropWhile __iruid_3638 (StreamFlatMap __iruid_3639 (ToStream True (GetField oldContexts (Ref __iruid_3635))) (StreamZip -1 AssertSameLength (__iruid_3640 __iruid_3641) (StreamMap __iruid_3642 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3639))) (Let __iruid_3643 (ToArray (StreamMap __iruid_3644 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3642))) (Let __iruid_3645 (InsertFields (SelectFields () (Ref __iruid_3644)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3644))))) (If (IsNA (Ref __iruid_3645)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3645))))) (Let __iruid_3646 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_3647 (StreamFilter __iruid_3648 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3643)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3643) (Ref __iruid_3648))))) (Let __iruid_3649 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3643) (Ref __iruid_3647)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_3649))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3649)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (Let __iruid_3650 (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_3646)) (Cast Float64 (GetTupleElement 1 (Ref __iruid_3646)))) (InsertFields (SelectFields () (Ref __iruid_3642)) ( "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3651 (ToStream False (Ref __iruid_3643)) (InsertFields (SelectFields () (Ref __iruid_3651)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3651))) (Ref __iruid_3650)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3639)) (GetField mwStopIdx (Ref __iruid_3639)) (I32 1)) (InsertFields (Ref __iruid_3640) None (__iruid_2764 (Ref __iruid_3641))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3638)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3636)) (Apply -1 includesStart () Boolean (Ref __iruid_3636)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3637)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3636)) (Apply -1 includesEnd () Boolean (Ref __iruid_3636)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3634))) (rowOfData (ToArray (StreamMap __iruid_3652 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3634)) (Ref __iruid_3632) (ApplyBinaryPrimOp Add (Ref __iruid_3632) (GetField blockSize (Ref __iruid_3619))) (I32 1))) (GetField __uid_6 (Ref __iruid_3652)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3619))) (blockColIdx (GetField blockColIdx (Ref __iruid_3619))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3653 (ToStream False (Ref __iruid_3633)) (ToStream False (GetField rowOfData (Ref __iruid_3653)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3633))) (Cast Int64 (GetField blockSize (Ref __iruid_3619)))) (True))))))) (Let __iruid_3654 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3631)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3631)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3654) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3631)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3654))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3618))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3616)) (GetTupleElement 0 (Ref __iruid_3617))))) (GetTupleElement 1 (Ref __iruid_3615)))))) 2023-04-22 21:14:22.154 : INFO: encoder cache miss (43 hits, 20 misses, 0.683) 2023-04-22 21:14:22.156 : INFO: instruction count: 3: __C2924HailClassLoaderContainer. 2023-04-22 21:14:22.156 : INFO: instruction count: 3: __C2924HailClassLoaderContainer. 2023-04-22 21:14:22.156 : INFO: instruction count: 3: __C2926FSContainer. 2023-04-22 21:14:22.156 : INFO: instruction count: 3: __C2926FSContainer. 2023-04-22 21:14:22.157 : INFO: instruction count: 3: __C2928etypeEncode. 2023-04-22 21:14:22.157 : INFO: instruction count: 7: __C2928etypeEncode.apply 2023-04-22 21:14:22.157 : INFO: instruction count: 33: __C2928etypeEncode.__m2930ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_struct_of_o_int32ENDANDr_binaryEND 2023-04-22 21:14:22.157 : INFO: instruction count: 16: __C2928etypeEncode.__m2931ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:22.157 : INFO: instruction count: 36: __C2928etypeEncode.__m2932ENCODE_SBaseStructPointer_TO_r_struct_of_o_int32END 2023-04-22 21:14:22.157 : INFO: instruction count: 4: __C2928etypeEncode.__m2933ENCODE_SInt32$_TO_o_int32 2023-04-22 21:14:22.159 MemoryStore: INFO: Block broadcast_187 stored as values in memory (estimated size 200.0 B, free 25.0 GiB) 2023-04-22 21:14:22.164 MemoryStore: INFO: Block broadcast_187_piece0 stored as bytes in memory (estimated size 155.0 B, free 25.0 GiB) 2023-04-22 21:14:22.164 BlockManagerInfo: INFO: Added broadcast_187_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 155.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.165 SparkContext: INFO: Created broadcast 187 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:22.165 : INFO: instruction count: 3: __C2714HailClassLoaderContainer. 2023-04-22 21:14:22.165 : INFO: instruction count: 3: __C2714HailClassLoaderContainer. 2023-04-22 21:14:22.165 : INFO: instruction count: 3: __C2716FSContainer. 2023-04-22 21:14:22.165 : INFO: instruction count: 3: __C2716FSContainer. 
2023-04-22 21:14:22.222 : INFO: instruction count: 3: __C2718collect_distributed_array_matrix_block_matrix_writer. 2023-04-22 21:14:22.222 : INFO: instruction count: 111: __C2718collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:22.222 : INFO: instruction count: 17: __C2718collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:22.222 : INFO: instruction count: 27: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2720DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:22.222 : INFO: instruction count: 53: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2721INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 26: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2722INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:22.222 : INFO: instruction count: 44: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2723INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 26: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2724INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 17: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2725INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 10: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2726INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:22.222 : INFO: instruction count: 10: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2727INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:22.222 : INFO: instruction count: 58: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2728INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 35: 
__C2718collect_distributed_array_matrix_block_matrix_writer.__m2729INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 44: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2730INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.222 : INFO: instruction count: 31: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2731INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:22.223 : INFO: instruction count: 27: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2733DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:22.223 : INFO: instruction count: 8: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2734INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:22.223 : INFO: instruction count: 483: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2736split_ToArray 2023-04-22 21:14:22.224 : INFO: instruction count: 699: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2739split_ToArray 2023-04-22 21:14:22.224 : INFO: instruction count: 256: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2760split_ToArray 2023-04-22 21:14:22.224 : INFO: instruction count: 8: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2768nNonRefAlleles 2023-04-22 21:14:22.224 : INFO: instruction count: 9: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2783begin_group_0 2023-04-22 21:14:22.224 : INFO: instruction count: 17: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2784begin_group_0 2023-04-22 21:14:22.224 : INFO: instruction count: 158: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2785split_StreamFor 2023-04-22 21:14:22.224 : INFO: instruction count: 35: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2793arrayref_bounds_check 2023-04-22 21:14:22.224 : INFO: instruction count: 70: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2797begin_group_0 2023-04-22 21:14:22.224 : INFO: instruction count: 5: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2800toInt64 2023-04-22 21:14:22.225 : INFO: instruction count: 170: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2807split_ToArray 2023-04-22 21:14:22.225 : INFO: instruction count: 2: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2841includesStart 2023-04-22 21:14:22.225 : INFO: instruction count: 111: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2842pointLessThanPartitionIntervalLeftEndpoint 2023-04-22 21:14:22.225 : INFO: instruction count: 11: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2843ord_compare 2023-04-22 21:14:22.225 : INFO: instruction count: 8: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2844ord_compareNonnull 2023-04-22 21:14:22.225 : INFO: instruction count: 2: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2850includesEnd 2023-04-22 21:14:22.225 : INFO: instruction count: 102: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2851pointLessThanPartitionIntervalRightEndpoint 2023-04-22 21:14:22.225 : INFO: instruction count: 364: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2857split_ToArray 2023-04-22 
21:14:22.225 : INFO: instruction count: 12: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2870setup_jab 2023-04-22 21:14:22.226 : INFO: instruction count: 18: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2901str 2023-04-22 21:14:22.226 : INFO: instruction count: 29: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2902concat 2023-04-22 21:14:22.226 : INFO: instruction count: 29: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2903concat 2023-04-22 21:14:22.241 : INFO: instruction count: 68: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2908ENCODE_SNDArrayPointer_TO_r_ndarray_of_o_float64 2023-04-22 21:14:22.241 : INFO: instruction count: 4: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2909ENCODE_SFloat64$_TO_o_float64 2023-04-22 21:14:22.241 : INFO: instruction count: 25: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2913ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND 2023-04-22 21:14:22.241 : INFO: instruction count: 35: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2914ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:22.241 : INFO: instruction count: 25: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2915ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:22.241 : INFO: instruction count: 4: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2916ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:22.241 : INFO: instruction count: 16: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2917ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:22.241 : INFO: instruction count: 9: __C2718collect_distributed_array_matrix_block_matrix_writer.setPartitionIndex 2023-04-22 21:14:22.241 : INFO: instruction count: 4: __C2718collect_distributed_array_matrix_block_matrix_writer.addPartitionRegion 2023-04-22 21:14:22.241 : INFO: instruction count: 4: __C2718collect_distributed_array_matrix_block_matrix_writer.setPool 2023-04-22 21:14:22.241 : INFO: instruction count: 3: __C2718collect_distributed_array_matrix_block_matrix_writer.addHailClassLoader 2023-04-22 21:14:22.241 : INFO: instruction count: 3: __C2718collect_distributed_array_matrix_block_matrix_writer.addFS 2023-04-22 21:14:22.241 : INFO: instruction count: 4: __C2718collect_distributed_array_matrix_block_matrix_writer.addTaskContext 2023-04-22 21:14:22.241 : INFO: instruction count: 3: __C2718collect_distributed_array_matrix_block_matrix_writer.setObjects 2023-04-22 21:14:22.241 : INFO: instruction count: 67: __C2718collect_distributed_array_matrix_block_matrix_writer.addAndDecodeLiterals 2023-04-22 21:14:22.241 : INFO: instruction count: 45: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2921DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDANDr_binaryEND_TO_SBaseStructPointer 2023-04-22 21:14:22.241 : INFO: instruction count: 48: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2922INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:14:22.241 : INFO: instruction count: 10: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2923INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:14:22.242 : INFO: instruction count: 384: __C2718collect_distributed_array_matrix_block_matrix_writer.__m2736split_ToArray_region20_22 2023-04-22 21:14:22.242 : INFO: instruction count: 216: 
__C2718collect_distributed_array_matrix_block_matrix_writer.__m2739split_ToArray_region7_14 2023-04-22 21:14:22.242 : INFO: instruction count: 3: __C2934__m2736split_ToArraySpills. 2023-04-22 21:14:22.242 : INFO: instruction count: 3: __C2948__m2739split_ToArraySpills. 2023-04-22 21:14:22.242 : INFO: instruction count: 3: __C2833staticWrapperClass_1. 2023-04-22 21:14:22.546 : INFO: encoder cache miss (43 hits, 21 misses, 0.672) 2023-04-22 21:14:22.562 : INFO: instruction count: 3: __C3031HailClassLoaderContainer. 2023-04-22 21:14:22.562 : INFO: instruction count: 3: __C3031HailClassLoaderContainer. 2023-04-22 21:14:22.562 : INFO: instruction count: 3: __C3033FSContainer. 2023-04-22 21:14:22.562 : INFO: instruction count: 3: __C3033FSContainer. 2023-04-22 21:14:22.564 : INFO: instruction count: 3: __C3035etypeEncode. 2023-04-22 21:14:22.564 : INFO: instruction count: 7: __C3035etypeEncode.apply 2023-04-22 21:14:22.564 : INFO: instruction count: 105: __C3035etypeEncode.__m3037ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_array_of_r_int64ANDr_array_of_r_int64ANDr_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32ENDEND 2023-04-22 21:14:22.564 : INFO: instruction count: 1: __C3035etypeEncode.__m3038ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:14:22.564 : INFO: instruction count: 35: __C3035etypeEncode.__m3039ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.564 : INFO: instruction count: 49: __C3035etypeEncode.__m3040ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.564 : INFO: instruction count: 16: __C3035etypeEncode.__m3041ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:22.564 : INFO: instruction count: 4: __C3035etypeEncode.__m3042ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:22.564 : INFO: instruction count: 39: __C3035etypeEncode.__m3043ENCODE_SIndexablePointer_TO_r_array_of_r_int64 2023-04-22 21:14:22.564 : INFO: instruction count: 4: __C3035etypeEncode.__m3044ENCODE_SInt64$_TO_r_int64 2023-04-22 21:14:22.564 : INFO: instruction count: 35: __C3035etypeEncode.__m3045ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END 2023-04-22 21:14:22.565 : INFO: instruction count: 51: __C3035etypeEncode.__m3046ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END 2023-04-22 21:14:22.565 : INFO: instruction count: 41: __C3035etypeEncode.__m3047ENCODE_SIntervalPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND 2023-04-22 21:14:22.565 : INFO: instruction count: 21: __C3035etypeEncode.__m3048ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.565 : INFO: instruction count: 13: __C3035etypeEncode.__m3049ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:14:22.565 : INFO: instruction count: 4: __C3035etypeEncode.__m3050ENCODE_SBoolean$_TO_r_bool 2023-04-22 21:14:22.565 : INFO: instruction count: 39: 
__C3035etypeEncode.__m3051ENCODE_SIndexablePointer_TO_r_array_of_r_int32 2023-04-22 21:14:22.570 MemoryStore: INFO: Block broadcast_188 stored as values in memory (estimated size 1584.0 B, free 25.0 GiB) 2023-04-22 21:14:22.574 MemoryStore: INFO: Block broadcast_188_piece0 stored as bytes in memory (estimated size 719.0 B, free 25.0 GiB) 2023-04-22 21:14:22.574 BlockManagerInfo: INFO: Added broadcast_188_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 719.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.574 SparkContext: INFO: Created broadcast 188 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:22.575 : INFO: instruction count: 3: __C2631HailClassLoaderContainer. 2023-04-22 21:14:22.575 : INFO: instruction count: 3: __C2631HailClassLoaderContainer. 2023-04-22 21:14:22.575 : INFO: instruction count: 3: __C2633FSContainer. 2023-04-22 21:14:22.575 : INFO: instruction count: 3: __C2633FSContainer. 2023-04-22 21:14:22.583 BlockManagerInfo: INFO: Removed broadcast_149_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.615 : INFO: instruction count: 3: __C2635Compiled. 2023-04-22 21:14:22.616 : INFO: instruction count: 107: __C2635Compiled.apply 2023-04-22 21:14:22.643 : INFO: instruction count: 444: __C2635Compiled.__m2637split_ToArray 2023-04-22 21:14:22.644 : INFO: instruction count: 424: __C2635Compiled.__m2639split_ToArray 2023-04-22 21:14:22.644 : INFO: instruction count: 177: __C2635Compiled.__m2676split_ToArray 2023-04-22 21:14:22.644 : INFO: instruction count: 35: __C2635Compiled.__m2684arrayref_bounds_check 2023-04-22 21:14:22.644 : INFO: instruction count: 11: __C2635Compiled.__m2705ord_equiv 2023-04-22 21:14:22.644 : INFO: instruction count: 14: __C2635Compiled.__m2706ord_equivNonnull 2023-04-22 21:14:22.644 : INFO: instruction count: 4: __C2635Compiled.setBackend 2023-04-22 21:14:22.644 : INFO: instruction count: 9: __C2635Compiled.__m2954ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:22.644 : INFO: instruction count: 57: __C2635Compiled.__m2955ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 51: __C2635Compiled.__m2956ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:22.644 : INFO: instruction count: 41: __C2635Compiled.__m2957ENCODE_SIntervalPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND 2023-04-22 21:14:22.644 : INFO: instruction count: 21: __C2635Compiled.__m2958ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 13: 
__C2635Compiled.__m2959ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 4: __C2635Compiled.__m2960ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:22.644 : INFO: instruction count: 4: __C2635Compiled.__m2961ENCODE_SBoolean$_TO_r_bool 2023-04-22 21:14:22.644 : INFO: instruction count: 35: __C2635Compiled.__m2962ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 33: __C2635Compiled.__m2963ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 49: __C2635Compiled.__m2964ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.644 : INFO: instruction count: 16: __C2635Compiled.__m2965ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:22.645 : INFO: instruction count: 9: __C2635Compiled.__m2966ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:14:22.645 : INFO: instruction count: 1: __C2635Compiled.__m2967ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:14:22.645 : INFO: instruction count: 27: __C2635Compiled.__m2970DECODE_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:14:22.645 : INFO: instruction count: 58: __C2635Compiled.__m2971INPLACE_DECODE_r_array_of_r_struct_of_r_int32ANDr_binaryEND_TO_r_array_of_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:22.645 : INFO: instruction count: 26: __C2635Compiled.__m2972INPLACE_DECODE_r_struct_of_r_int32ANDr_binaryEND_TO_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:22.645 : INFO: instruction count: 10: __C2635Compiled.__m2973INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:22.645 : INFO: instruction count: 31: __C2635Compiled.__m2974INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:22.645 : INFO: instruction count: 12: __C2635Compiled.__m2993setup_jab 2023-04-22 21:14:22.645 : INFO: instruction count: 47: __C2635Compiled.__m2996dependent_sorting_func 2023-04-22 21:14:22.645 : INFO: instruction count: 11: __C2635Compiled.__m2999ord_lt 2023-04-22 21:14:22.645 : INFO: instruction count: 14: __C2635Compiled.__m3000ord_ltNonnull 2023-04-22 21:14:22.645 : INFO: instruction count: 202: __C2635Compiled.__m3001arraySorter_outer 2023-04-22 21:14:22.645 : INFO: instruction count: 66: __C2635Compiled.__m3002arraySorter_merge 2023-04-22 21:14:22.645 : INFO: instruction count: 36: __C2635Compiled.__m3003arraySorter_splitMerge 2023-04-22 21:14:22.645 : INFO: instruction count: 9: __C2635Compiled.setPartitionIndex 2023-04-22 21:14:22.645 : INFO: instruction count: 4: __C2635Compiled.addPartitionRegion 2023-04-22 21:14:22.645 : INFO: instruction count: 4: __C2635Compiled.setPool 2023-04-22 21:14:22.645 : INFO: instruction count: 3: __C2635Compiled.addHailClassLoader 2023-04-22 21:14:22.645 : INFO: instruction count: 3: __C2635Compiled.addFS 2023-04-22 21:14:22.646 : INFO: instruction count: 4: __C2635Compiled.addTaskContext 2023-04-22 21:14:22.646 : INFO: instruction count: 3: __C2635Compiled.setObjects 2023-04-22 21:14:22.646 : INFO: instruction count: 133: __C2635Compiled.addAndDecodeLiterals 2023-04-22 21:14:22.646 : INFO: instruction count: 63: 
__C2635Compiled.__m3018DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_array_of_r_int64ANDr_array_of_r_int64ANDr_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:22.646 : INFO: instruction count: 8: __C2635Compiled.__m3019INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:22.646 : INFO: instruction count: 58: __C2635Compiled.__m3020INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 44: __C2635Compiled.__m3021INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 58: __C2635Compiled.__m3022INPLACE_DECODE_r_array_of_r_int64_TO_r_array_of_r_int64 2023-04-22 21:14:22.646 : INFO: instruction count: 10: __C2635Compiled.__m3023INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:14:22.646 : INFO: instruction count: 58: __C2635Compiled.__m3024INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_array_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 26: __C2635Compiled.__m3025INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 44: __C2635Compiled.__m3026INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 26: __C2635Compiled.__m3027INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 17: __C2635Compiled.__m3028INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:22.646 : INFO: instruction count: 10: __C2635Compiled.__m3029INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:22.646 : INFO: instruction count: 58: __C2635Compiled.__m3030INPLACE_DECODE_r_array_of_r_int32_TO_r_array_of_r_int32 2023-04-22 21:14:22.647 : INFO: instruction count: 110: __C2635Compiled.__m2637split_ToArray_region22_24 2023-04-22 21:14:22.647 : INFO: instruction count: 586: __C2635Compiled.__m2637split_ToArray_region16_69 2023-04-22 21:14:22.647 : INFO: instruction count: 3: __C3052__m2637split_ToArraySpills. 2023-04-22 21:14:22.647 : INFO: instruction count: 3: __C2968staticWrapperClass_1. 
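The per-row aggregation in the IR dumps computes __gt as GT.nNonRefAlleles(), takes the mean of the non-missing values (the RunAgg pair of Sum states, +PFloat64 for the running total and +PInt64 for the count, after LowerArrayAggsToRunAggs), and fills missing entries with that row mean via Coalesce before the block is assembled. The following is a rough, user-level sketch of an equivalent entry expression; it is an assumption about what such a pipeline looks like in Hail's Python API, not the code that produced this log.

    # Rough sketch (assumed, not from the job): per-row mean imputation of
    # n_alt_alleles, mirroring Coalesce(Float64(__gt), __mean_gt) in the IR.
    import hail as hl

    def mean_imputed_entries(mt):
        # Hypothetical helper; hl.agg.mean skips missing entries by default,
        # matching the non-missing sum/count pair in the RunAgg lowering.
        mt = mt.annotate_entries(__gt=mt.GT.n_alt_alleles())
        mt = mt.annotate_rows(__mean_gt=hl.agg.mean(mt.__gt))
        return mt.annotate_entries(x=hl.or_else(hl.float64(mt.__gt), mt.__mean_gt))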
2023-04-22 21:14:22.651 : INFO: initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3249 (ToStream False (ArraySort __iruid_3250 __iruid_3251 (StreamFlatMap __iruid_3252 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3253 __iruid_3254 (StreamFlatMap __iruid_3255 (StreamZip -1 AssertSameLength (__iruid_3256 __iruid_3257) (Let __iruid_3258 (ToArray (StreamZip -1 AssertSameLength (__iruid_3259 __iruid_3260 __iruid_3261) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3259)) (mwStartIdx (Cast Int32 (Ref __iruid_3260))) (mwStopIdx (Cast Int32 (Ref __iruid_3261)))))) (StreamMap __iruid_3262 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3262))) (oldContexts (ToArray (StreamMap __iruid_3263 (ToStream False (GetField parentPartitions (Ref __iruid_3262))) (ArrayRef -1 (Ref __iruid_3258) (Ref __iruid_3263)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3264 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3256)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3264) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3264) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3264)) (blockRowIdx (Ref __iruid_3257))))) (Ref __iruid_3255)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3265 (Let __iruid_3266 (GetField blockStart (Ref __iruid_3253)) (Let __iruid_3267 (ToArray (StreamMap __iruid_3268 (Let __iruid_3269 (GetField oldTableCtx (Ref __iruid_3253)) (Let __iruid_3270 (GetField partitionBound (Ref __iruid_3269)) (StreamTakeWhile __iruid_3271 (StreamDropWhile __iruid_3272 (StreamFlatMap __iruid_3273 (ToStream True (GetField oldContexts (Ref __iruid_3269))) (StreamZip -1 AssertSameLength (__iruid_3274 __iruid_3275) (StreamMap __iruid_3276 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3273))) (Let __iruid_3277 (ToArray (StreamMap __iruid_3278 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3276))) (Let __iruid_3279 (InsertFields (SelectFields () (Ref __iruid_3278)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3278))))) (If (IsNA (Ref __iruid_3279)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3279))))) (Let __iruid_3280 (StreamAgg __iruid_3281 (StreamFilter __iruid_3282 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3277)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3282))))) (InsertFields (SelectFields () (Ref __iruid_3276)) None (__mean_gt (AggLet __iruid_3283 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3281)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3283))) (Let __iruid_3284 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3283)))))) (Cast Float64 (Ref __iruid_3284)))))))) (InsertFields (Ref __iruid_3280) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3285 (ToStream False (Ref __iruid_3277)) (InsertFields (SelectFields () (Ref __iruid_3285)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3285))) (GetField __mean_gt (Ref __iruid_3280)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3273)) (GetField mwStopIdx (Ref __iruid_3273)) (I32 1)) (InsertFields (Ref __iruid_3274) None (__iruid_2764 (Ref __iruid_3275))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3272)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesStart () Boolean (Ref __iruid_3270)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3271)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesEnd () Boolean (Ref __iruid_3270)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3268))) (rowOfData (ToArray (StreamMap __iruid_3286 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3268)) (Ref __iruid_3266) (ApplyBinaryPrimOp Add (Ref __iruid_3266) (GetField blockSize (Ref __iruid_3253))) (I32 1))) (GetField __uid_6 (Ref __iruid_3286)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3253))) (blockColIdx (GetField blockColIdx (Ref __iruid_3253))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3287 (ToStream False (Ref __iruid_3267)) (ToStream False (GetField rowOfData (Ref __iruid_3287)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3267))) (Cast Int64 (GetField blockSize (Ref __iruid_3253)))) (True))))))) (Let __iruid_3288 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3265)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3265)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3288) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3265)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3288))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3252))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3250)) (GetTupleElement 0 (Ref __iruid_3251))))) (GetTupleElement 1 (Ref __iruid_3249)))))) 2023-04-22 21:14:22.700 BlockManagerInfo: INFO: Removed broadcast_180_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.753 : INFO: after optimize: compileLowerer, initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3736 (ToStream False (ArraySort __iruid_3737 __iruid_3738 (StreamFlatMap __iruid_3739 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3740 __iruid_3741 (StreamFlatMap __iruid_3742 (StreamZip -1 AssertSameLength (__iruid_3743 __iruid_3744) (Let __iruid_3745 (ToArray (StreamZip -1 AssertSameLength (__iruid_3746 __iruid_3747 __iruid_3748) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3746)) (mwStartIdx (Cast Int32 (Ref __iruid_3747))) (mwStopIdx (Cast Int32 (Ref __iruid_3748)))))) (StreamMap __iruid_3749 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3749))) (oldContexts (ToArray (StreamMap __iruid_3750 (ToStream False (GetField parentPartitions (Ref __iruid_3749))) (ArrayRef -1 (Ref __iruid_3745) (Ref __iruid_3750)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3751 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref 
__iruid_3743)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3751) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3751) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3751)) (blockRowIdx (Ref __iruid_3744))))) (Ref __iruid_3742)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3752 (Let __iruid_3753 (GetField blockStart (Ref __iruid_3740)) (Let __iruid_3754 (ToArray (StreamMap __iruid_3755 (Let __iruid_3756 (GetField oldTableCtx (Ref __iruid_3740)) (Let __iruid_3757 (GetField partitionBound (Ref __iruid_3756)) (StreamTakeWhile __iruid_3758 (StreamDropWhile __iruid_3759 (StreamFlatMap __iruid_3760 (ToStream True (GetField oldContexts (Ref __iruid_3756))) (StreamZip -1 AssertSameLength (__iruid_3761 __iruid_3762) (StreamMap __iruid_3763 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3760))) (Let __iruid_3764 (ToArray (StreamMap __iruid_3765 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3763))) (Let __iruid_3766 (InsertFields (SelectFields () (Ref __iruid_3765)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3765))))) (If (IsNA (Ref __iruid_3766)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3766))))) (Let __iruid_3767 (StreamAgg __iruid_3768 (StreamFilter __iruid_3769 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3764)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3764) (Ref __iruid_3769))))) (InsertFields (SelectFields () (Ref __iruid_3763)) None (__mean_gt (AggLet __iruid_3770 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3764) (Ref __iruid_3768)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3770))) (Let __iruid_3771 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3770)))))) (Cast Float64 (Ref __iruid_3771)))))))) (InsertFields (Ref __iruid_3767) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3772 (ToStream False (Ref __iruid_3764)) (InsertFields (SelectFields () (Ref __iruid_3772)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3772))) (GetField __mean_gt (Ref __iruid_3767)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3760)) (GetField mwStopIdx (Ref __iruid_3760)) (I32 1)) (InsertFields (Ref __iruid_3761) None (__iruid_2764 (Ref __iruid_3762))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3759)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3757)) (Apply -1 includesStart () Boolean (Ref __iruid_3757)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3758)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3757)) (Apply -1 includesEnd () Boolean (Ref __iruid_3757)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3755))) (rowOfData (ToArray (StreamMap __iruid_3773 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3755)) (Ref __iruid_3753) (ApplyBinaryPrimOp Add (Ref __iruid_3753) (GetField blockSize (Ref __iruid_3740))) (I32 1))) (GetField __uid_6 (Ref __iruid_3773)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3740))) (blockColIdx (GetField blockColIdx (Ref __iruid_3740))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3774 (ToStream False (Ref __iruid_3754)) (ToStream False (GetField rowOfData (Ref __iruid_3774)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3754))) (Cast Int64 (GetField blockSize (Ref __iruid_3740)))) (True))))))) (Let __iruid_3775 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3752)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3752)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3775) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3752)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3775))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3739))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3737)) (GetTupleElement 0 (Ref __iruid_3738))))) (GetTupleElement 1 (Ref __iruid_3736)))))) 2023-04-22 21:14:22.775 : INFO: after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3736 (ToStream False (ArraySort __iruid_3737 __iruid_3738 (StreamFlatMap __iruid_3739 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3740 __iruid_3741 (StreamFlatMap __iruid_3742 (StreamZip -1 AssertSameLength (__iruid_3743 __iruid_3744) (Let 
__iruid_3745 (ToArray (StreamZip -1 AssertSameLength (__iruid_3746 __iruid_3747 __iruid_3748) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3746)) (mwStartIdx (Cast Int32 (Ref __iruid_3747))) (mwStopIdx (Cast Int32 (Ref __iruid_3748)))))) (StreamMap __iruid_3749 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3749))) (oldContexts (ToArray (StreamMap __iruid_3750 (ToStream False (GetField parentPartitions (Ref __iruid_3749))) (ArrayRef -1 (Ref __iruid_3745) (Ref __iruid_3750)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3751 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3743)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3751) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3751) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3751)) (blockRowIdx (Ref __iruid_3744))))) (Ref __iruid_3742)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3752 (Let __iruid_3753 (GetField blockStart (Ref __iruid_3740)) (Let __iruid_3754 (ToArray (StreamMap __iruid_3755 (Let __iruid_3756 (GetField oldTableCtx (Ref __iruid_3740)) (Let __iruid_3757 (GetField partitionBound (Ref __iruid_3756)) (StreamTakeWhile __iruid_3758 (StreamDropWhile __iruid_3759 (StreamFlatMap __iruid_3760 (ToStream True (GetField oldContexts (Ref __iruid_3756))) (StreamZip -1 AssertSameLength (__iruid_3761 __iruid_3762) (StreamMap __iruid_3763 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3760))) (Let __iruid_3764 (ToArray (StreamMap __iruid_3765 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3763))) (Let __iruid_3766 (InsertFields (SelectFields () (Ref __iruid_3765)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3765))))) (If (IsNA (Ref __iruid_3766)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3766))))) (Let __iruid_3767 (StreamAgg __iruid_3768 (StreamFilter __iruid_3769 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3764)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3764) (Ref __iruid_3769))))) (InsertFields (SelectFields () (Ref __iruid_3763)) None (__mean_gt (AggLet __iruid_3770 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3764) (Ref __iruid_3768)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3770))) (Let __iruid_3771 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3770)))))) (Cast Float64 (Ref __iruid_3771)))))))) (InsertFields (Ref __iruid_3767) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3772 (ToStream False (Ref __iruid_3764)) (InsertFields (SelectFields () (Ref __iruid_3772)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3772))) (GetField __mean_gt (Ref __iruid_3767)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3760)) (GetField mwStopIdx (Ref __iruid_3760)) (I32 1)) (InsertFields (Ref __iruid_3761) None (__iruid_2764 (Ref __iruid_3762))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3759)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3757)) (Apply -1 includesStart () Boolean (Ref __iruid_3757)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3758)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3757)) (Apply -1 includesEnd () Boolean (Ref __iruid_3757)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3755))) (rowOfData (ToArray (StreamMap __iruid_3773 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3755)) (Ref __iruid_3753) (ApplyBinaryPrimOp Add (Ref __iruid_3753) (GetField blockSize (Ref __iruid_3740))) (I32 1))) (GetField __uid_6 (Ref __iruid_3773)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3740))) (blockColIdx (GetField blockColIdx (Ref __iruid_3740))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3774 (ToStream False (Ref __iruid_3754)) (ToStream False (GetField rowOfData (Ref __iruid_3774)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3754))) (Cast Int64 (GetField blockSize (Ref __iruid_3740)))) (True))))))) (Let __iruid_3775 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3752)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3752)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3775) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3752)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3775))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3739))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3737)) (GetTupleElement 0 (Ref __iruid_3738))))) (GetTupleElement 1 (Ref __iruid_3736)))))) 2023-04-22 21:14:22.828 BlockManagerInfo: INFO: Removed broadcast_174_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 60.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.883 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3856 (ToStream False (ArraySort __iruid_3857 __iruid_3858 (StreamFlatMap __iruid_3859 
(ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3860 __iruid_3861 (StreamFlatMap __iruid_3862 (StreamZip -1 AssertSameLength (__iruid_3863 __iruid_3864) (Let __iruid_3865 (ToArray (StreamZip -1 AssertSameLength (__iruid_3866 __iruid_3867 __iruid_3868) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3866)) (mwStartIdx (Cast Int32 (Ref __iruid_3867))) (mwStopIdx (Cast Int32 (Ref __iruid_3868)))))) (StreamMap __iruid_3869 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3869))) (oldContexts (ToArray (StreamMap __iruid_3870 (ToStream False (GetField parentPartitions (Ref __iruid_3869))) (ArrayRef -1 (Ref __iruid_3865) (Ref __iruid_3870)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3871 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3863)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3871) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3871) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3871)) (blockRowIdx (Ref __iruid_3864))))) (Ref __iruid_3862)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3872 (Let __iruid_3873 (GetField blockStart (Ref __iruid_3860)) (Let __iruid_3874 (ToArray (StreamMap __iruid_3875 (Let __iruid_3876 (GetField oldTableCtx (Ref __iruid_3860)) (Let __iruid_3877 (GetField partitionBound (Ref __iruid_3876)) (StreamTakeWhile __iruid_3878 (StreamDropWhile __iruid_3879 (StreamFlatMap __iruid_3880 (ToStream True (GetField oldContexts (Ref __iruid_3876))) (StreamZip -1 AssertSameLength (__iruid_3881 __iruid_3882) (StreamMap __iruid_3883 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3880))) (Let __iruid_3884 (ToArray (StreamMap __iruid_3885 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3883))) (Let __iruid_3886 (InsertFields (SelectFields () (Ref __iruid_3885)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3885))))) (If (IsNA (Ref __iruid_3886)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3886))))) (Let __iruid_3887 (StreamAgg __iruid_3888 (StreamFilter __iruid_3889 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3884)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3884) (Ref __iruid_3889))))) (InsertFields (SelectFields () (Ref __iruid_3883)) None (__mean_gt (AggLet __iruid_3890 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3884) (Ref __iruid_3888)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3890))) (Let __iruid_3891 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3890)))))) (Cast Float64 (Ref __iruid_3891)))))))) (InsertFields (Ref __iruid_3887) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3892 (ToStream False (Ref __iruid_3884)) (InsertFields (SelectFields () (Ref __iruid_3892)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3892))) (GetField __mean_gt (Ref __iruid_3887)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3880)) (GetField mwStopIdx (Ref __iruid_3880)) (I32 1)) (InsertFields (Ref __iruid_3881) None (__iruid_2764 (Ref __iruid_3882))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3879)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3877)) (Apply -1 includesStart () Boolean (Ref __iruid_3877)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3878)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3877)) (Apply -1 includesEnd () Boolean (Ref __iruid_3877)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3875))) (rowOfData (ToArray (StreamMap __iruid_3893 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3875)) (Ref __iruid_3873) (ApplyBinaryPrimOp Add (Ref __iruid_3873) (GetField blockSize (Ref __iruid_3860))) (I32 1))) (GetField __uid_6 (Ref __iruid_3893)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3860))) (blockColIdx (GetField blockColIdx (Ref __iruid_3860))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3894 (ToStream False (Ref __iruid_3874)) (ToStream False (GetField rowOfData (Ref __iruid_3894)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3874))) (Cast Int64 (GetField blockSize (Ref __iruid_3860)))) (True))))))) (Let __iruid_3895 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3872)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3872)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3895) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3872)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3895))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3859))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3857)) (GetTupleElement 0 (Ref __iruid_3858))))) (GetTupleElement 1 (Ref __iruid_3856)))))) 2023-04-22 21:14:22.914 : INFO: after LowerArrayAggsToRunAggs: IR size 261: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3856 (ToStream False (ArraySort __iruid_3857 __iruid_3858 (StreamFlatMap __iruid_3859 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3860 __iruid_3861 (StreamFlatMap __iruid_3862 (StreamZip -1 AssertSameLength (__iruid_3863 __iruid_3864) (Let __iruid_3865 (ToArray (StreamZip -1 AssertSameLength (__iruid_3866 __iruid_3867 __iruid_3868) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3866)) (mwStartIdx (Cast Int32 (Ref __iruid_3867))) (mwStopIdx (Cast Int32 (Ref __iruid_3868)))))) (StreamMap __iruid_3869 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3869))) (oldContexts (ToArray (StreamMap __iruid_3870 (ToStream False (GetField parentPartitions (Ref __iruid_3869))) (ArrayRef -1 (Ref __iruid_3865) (Ref __iruid_3870)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3871 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3863)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3871) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3871) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_3871)) (blockRowIdx (Ref __iruid_3864))))) (Ref __iruid_3862)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3872 (Let __iruid_3873 (GetField blockStart (Ref __iruid_3860)) (Let __iruid_3874 (ToArray (StreamMap __iruid_3875 (Let __iruid_3876 (GetField oldTableCtx (Ref __iruid_3860)) (Let __iruid_3877 (GetField partitionBound (Ref __iruid_3876)) (StreamTakeWhile __iruid_3878 (StreamDropWhile __iruid_3879 (StreamFlatMap __iruid_3880 (ToStream True (GetField oldContexts (Ref __iruid_3876))) (StreamZip -1 AssertSameLength (__iruid_3881 __iruid_3882) (StreamMap __iruid_3883 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3880))) (Let __iruid_3884 (ToArray (StreamMap __iruid_3885 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3883))) (Let __iruid_3886 (InsertFields (SelectFields () (Ref __iruid_3885)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3885))))) (If (IsNA (Ref __iruid_3886)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3886))))) (Let __iruid_3887 (Let __iruid_3896 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_3888 (StreamFilter __iruid_3889 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3884)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3884) (Ref __iruid_3889))))) (Let __iruid_3890 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3884) (Ref __iruid_3888)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_3890))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3890)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_3883)) None (__mean_gt (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_3896)) (Let __iruid_3891 (GetTupleElement 1 (Ref __iruid_3896)) (Cast Float64 (Ref __iruid_3891))))))) (InsertFields (Ref __iruid_3887) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3892 (ToStream False (Ref __iruid_3884)) (InsertFields (SelectFields () (Ref __iruid_3892)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3892))) (GetField __mean_gt (Ref __iruid_3887)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3880)) (GetField mwStopIdx (Ref __iruid_3880)) (I32 1)) (InsertFields (Ref __iruid_3881) None (__iruid_2764 (Ref __iruid_3882))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3879)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3877)) (Apply -1 includesStart () Boolean (Ref __iruid_3877)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3878)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3877)) (Apply -1 includesEnd () Boolean (Ref __iruid_3877)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3875))) (rowOfData (ToArray (StreamMap __iruid_3893 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3875)) (Ref __iruid_3873) (ApplyBinaryPrimOp Add (Ref __iruid_3873) (GetField blockSize (Ref __iruid_3860))) (I32 1))) (GetField __uid_6 (Ref __iruid_3893)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3860))) (blockColIdx (GetField blockColIdx (Ref __iruid_3860))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3894 (ToStream False (Ref __iruid_3874)) (ToStream False (GetField rowOfData (Ref __iruid_3894)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3874))) (Cast Int64 (GetField blockSize (Ref __iruid_3860)))) (True))))))) (Let __iruid_3895 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3872)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3872)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3895) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3872)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3895))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3859))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3857)) (GetTupleElement 0 (Ref __iruid_3858))))) (GetTupleElement 1 (Ref __iruid_3856)))))) 2023-04-22 21:14:22.938 BlockManagerInfo: INFO: Removed broadcast_154_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:14:22.956 : INFO: Prune: InsertFields: eliminating field '__mean_gt' 2023-04-22 21:14:23.044 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 256: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap 
__iruid_3982 (ToStream False (ArraySort __iruid_3983 __iruid_3984 (StreamFlatMap __iruid_3985 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3986 __iruid_3987 (StreamFlatMap __iruid_3988 (StreamZip -1 AssertSameLength (__iruid_3989 __iruid_3990) (Let __iruid_3991 (ToArray (StreamZip -1 AssertSameLength (__iruid_3992 __iruid_3993 __iruid_3994) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3992)) (mwStartIdx (Cast Int32 (Ref __iruid_3993))) (mwStopIdx (Cast Int32 (Ref __iruid_3994)))))) (StreamMap __iruid_3995 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3995))) (oldContexts (ToArray (StreamMap __iruid_3996 (ToStream False (GetField parentPartitions (Ref __iruid_3995))) (ArrayRef -1 (Ref __iruid_3991) (Ref __iruid_3996)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3997 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3989)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3997) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3997) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3997)) (blockRowIdx (Ref __iruid_3990))))) (Ref __iruid_3988)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3998 (Let __iruid_3999 (GetField blockStart (Ref __iruid_3986)) (Let __iruid_4000 (ToArray (StreamMap __iruid_4001 (Let __iruid_4002 (GetField oldTableCtx (Ref __iruid_3986)) (Let __iruid_4003 (GetField partitionBound (Ref __iruid_4002)) (StreamTakeWhile __iruid_4004 (StreamDropWhile __iruid_4005 (StreamFlatMap __iruid_4006 (ToStream True (GetField oldContexts (Ref __iruid_4002))) (StreamZip -1 AssertSameLength (__iruid_4007 __iruid_4008) (StreamMap __iruid_4009 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4006))) (Let __iruid_4010 (ToArray (StreamMap __iruid_4011 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4009))) (Let __iruid_4012 (InsertFields (SelectFields () (Ref __iruid_4011)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4011))))) (If (IsNA (Ref __iruid_4012)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4012))))) (Let __iruid_4013 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_4014 (StreamFilter __iruid_4015 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4010)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4010) (Ref __iruid_4015))))) (Let __iruid_4016 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4010) (Ref __iruid_4014)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_4016))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4016)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (Let __iruid_4017 (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_4013)) (Cast Float64 (GetTupleElement 1 (Ref __iruid_4013)))) (InsertFields (SelectFields () (Ref __iruid_4009)) ( "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4018 (ToStream False (Ref __iruid_4010)) (InsertFields (SelectFields () (Ref __iruid_4018)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4018))) (Ref __iruid_4017)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4006)) (GetField mwStopIdx (Ref __iruid_4006)) (I32 1)) (InsertFields (Ref __iruid_4007) None (__iruid_2764 (Ref __iruid_4008))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4005)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4003)) (Apply -1 includesStart () Boolean (Ref __iruid_4003)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4004)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4003)) (Apply -1 includesEnd () Boolean (Ref __iruid_4003)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4001))) (rowOfData (ToArray (StreamMap __iruid_4019 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4001)) (Ref __iruid_3999) (ApplyBinaryPrimOp Add (Ref __iruid_3999) (GetField blockSize (Ref __iruid_3986))) (I32 1))) (GetField __uid_6 (Ref __iruid_4019)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3986))) (blockColIdx (GetField blockColIdx (Ref __iruid_3986))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4020 (ToStream False (Ref __iruid_4000)) (ToStream False (GetField rowOfData (Ref __iruid_4020)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4000))) (Cast Int64 (GetField blockSize (Ref __iruid_3986)))) (True))))))) (Let __iruid_4021 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3998)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3998)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4021) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3998)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4021))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3985))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3983)) (GetTupleElement 0 (Ref __iruid_3984))))) (GetTupleElement 1 (Ref __iruid_3982)))))) 2023-04-22 21:14:23.047 BlockManagerInfo: INFO: Removed broadcast_178_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.155 : INFO: encoder cache hit 2023-04-22 21:14:23.156 MemoryStore: INFO: Block broadcast_189 stored as values in memory (estimated size 200.0 B, free 25.0 GiB) 2023-04-22 21:14:23.157 MemoryStore: INFO: Block broadcast_189_piece0 stored as bytes in memory (estimated size 155.0 B, free 25.0 GiB) 2023-04-22 21:14:23.166 BlockManagerInfo: INFO: Added broadcast_189_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 155.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.166 BlockManagerInfo: INFO: Removed broadcast_145_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 1087.8 KiB, free: 25.3 GiB) 2023-04-22 21:14:23.167 SparkContext: INFO: Created broadcast 189 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:23.167 : INFO: instruction count: 3: __C3141HailClassLoaderContainer. 2023-04-22 21:14:23.167 : INFO: instruction count: 3: __C3141HailClassLoaderContainer. 2023-04-22 21:14:23.167 : INFO: instruction count: 3: __C3143FSContainer. 2023-04-22 21:14:23.167 : INFO: instruction count: 3: __C3143FSContainer. 2023-04-22 21:14:23.218 : INFO: instruction count: 3: __C3145collect_distributed_array_matrix_block_matrix_writer. 
2023-04-22 21:14:23.218 : INFO: instruction count: 111: __C3145collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:23.218 : INFO: instruction count: 17: __C3145collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:23.218 : INFO: instruction count: 27: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3147DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.218 : INFO: instruction count: 53: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3148INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.218 : INFO: instruction count: 26: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3149INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:23.218 : INFO: instruction count: 44: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3150INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.218 : INFO: instruction count: 26: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3151INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.219 : INFO: instruction count: 17: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3152INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:23.219 : INFO: instruction count: 10: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3153INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:23.219 : INFO: instruction count: 10: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3154INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:23.219 : INFO: instruction count: 58: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3155INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.219 : INFO: instruction count: 35: 
__C3145collect_distributed_array_matrix_block_matrix_writer.__m3156INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.219 : INFO: instruction count: 44: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3157INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.219 : INFO: instruction count: 31: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3158INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:23.219 : INFO: instruction count: 27: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3160DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.219 : INFO: instruction count: 8: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3161INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:23.219 : INFO: instruction count: 483: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3163split_ToArray 2023-04-22 21:14:23.220 : INFO: instruction count: 699: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3166split_ToArray 2023-04-22 21:14:23.220 : INFO: instruction count: 256: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3187split_ToArray 2023-04-22 21:14:23.220 : INFO: instruction count: 8: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3195nNonRefAlleles 2023-04-22 21:14:23.220 : INFO: instruction count: 9: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3210begin_group_0 2023-04-22 21:14:23.220 : INFO: instruction count: 17: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3211begin_group_0 2023-04-22 21:14:23.221 : INFO: instruction count: 158: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3212split_StreamFor 2023-04-22 21:14:23.221 : INFO: instruction count: 35: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3220arrayref_bounds_check 2023-04-22 21:14:23.221 : INFO: instruction count: 70: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3224begin_group_0 2023-04-22 21:14:23.221 : INFO: instruction count: 5: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3227toInt64 2023-04-22 21:14:23.221 : INFO: instruction count: 170: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3234split_ToArray 2023-04-22 21:14:23.221 : INFO: instruction count: 2: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3268includesStart 2023-04-22 21:14:23.221 : INFO: instruction count: 111: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3269pointLessThanPartitionIntervalLeftEndpoint 2023-04-22 21:14:23.221 : INFO: instruction count: 11: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3270ord_compare 2023-04-22 21:14:23.221 : INFO: instruction count: 8: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3271ord_compareNonnull 2023-04-22 21:14:23.221 : INFO: instruction count: 2: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3277includesEnd 2023-04-22 21:14:23.221 : INFO: instruction count: 102: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3278pointLessThanPartitionIntervalRightEndpoint 2023-04-22 21:14:23.222 : INFO: instruction count: 364: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3284split_ToArray 2023-04-22 
21:14:23.222 : INFO: instruction count: 12: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3297setup_jab 2023-04-22 21:14:23.222 : INFO: instruction count: 18: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3328str 2023-04-22 21:14:23.222 : INFO: instruction count: 29: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3329concat 2023-04-22 21:14:23.222 : INFO: instruction count: 29: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3330concat 2023-04-22 21:14:23.222 : INFO: instruction count: 68: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3335ENCODE_SNDArrayPointer_TO_r_ndarray_of_o_float64 2023-04-22 21:14:23.222 : INFO: instruction count: 4: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3336ENCODE_SFloat64$_TO_o_float64 2023-04-22 21:14:23.222 : INFO: instruction count: 25: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3340ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND 2023-04-22 21:14:23.222 : INFO: instruction count: 35: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3341ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:23.222 : INFO: instruction count: 25: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3342ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:23.222 : INFO: instruction count: 4: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3343ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:23.222 : INFO: instruction count: 16: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3344ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:23.222 : INFO: instruction count: 9: __C3145collect_distributed_array_matrix_block_matrix_writer.setPartitionIndex 2023-04-22 21:14:23.222 : INFO: instruction count: 4: __C3145collect_distributed_array_matrix_block_matrix_writer.addPartitionRegion 2023-04-22 21:14:23.222 : INFO: instruction count: 4: __C3145collect_distributed_array_matrix_block_matrix_writer.setPool 2023-04-22 21:14:23.222 : INFO: instruction count: 3: __C3145collect_distributed_array_matrix_block_matrix_writer.addHailClassLoader 2023-04-22 21:14:23.222 : INFO: instruction count: 3: __C3145collect_distributed_array_matrix_block_matrix_writer.addFS 2023-04-22 21:14:23.222 : INFO: instruction count: 4: __C3145collect_distributed_array_matrix_block_matrix_writer.addTaskContext 2023-04-22 21:14:23.222 : INFO: instruction count: 3: __C3145collect_distributed_array_matrix_block_matrix_writer.setObjects 2023-04-22 21:14:23.223 : INFO: instruction count: 67: __C3145collect_distributed_array_matrix_block_matrix_writer.addAndDecodeLiterals 2023-04-22 21:14:23.223 : INFO: instruction count: 45: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3348DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDANDr_binaryEND_TO_SBaseStructPointer 2023-04-22 21:14:23.223 : INFO: instruction count: 48: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3349INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:14:23.223 : INFO: instruction count: 10: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3350INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:14:23.223 : INFO: instruction count: 384: __C3145collect_distributed_array_matrix_block_matrix_writer.__m3163split_ToArray_region20_22 2023-04-22 21:14:23.223 : INFO: instruction count: 216: 
__C3145collect_distributed_array_matrix_block_matrix_writer.__m3166split_ToArray_region7_14 2023-04-22 21:14:23.223 : INFO: instruction count: 3: __C3351__m3163split_ToArraySpills. 2023-04-22 21:14:23.223 : INFO: instruction count: 3: __C3365__m3166split_ToArraySpills. 2023-04-22 21:14:23.223 : INFO: instruction count: 3: __C3260staticWrapperClass_1. 2023-04-22 21:14:23.284 BlockManagerInfo: INFO: Removed broadcast_171_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 61.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.287 : INFO: encoder cache hit 2023-04-22 21:14:23.287 MemoryStore: INFO: Block broadcast_190 stored as values in memory (estimated size 1584.0 B, free 25.1 GiB) 2023-04-22 21:14:23.289 MemoryStore: INFO: Block broadcast_190_piece0 stored as bytes in memory (estimated size 719.0 B, free 25.1 GiB) 2023-04-22 21:14:23.290 BlockManagerInfo: INFO: Added broadcast_190_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 719.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.290 SparkContext: INFO: Created broadcast 190 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:23.290 : INFO: instruction count: 3: __C3058HailClassLoaderContainer. 2023-04-22 21:14:23.290 : INFO: instruction count: 3: __C3058HailClassLoaderContainer. 2023-04-22 21:14:23.290 : INFO: instruction count: 3: __C3060FSContainer. 2023-04-22 21:14:23.291 : INFO: instruction count: 3: __C3060FSContainer. 2023-04-22 21:14:23.326 : INFO: instruction count: 3: __C3062Compiled. 2023-04-22 21:14:23.326 : INFO: instruction count: 107: __C3062Compiled.apply 2023-04-22 21:14:23.327 : INFO: instruction count: 444: __C3062Compiled.__m3064split_ToArray 2023-04-22 21:14:23.327 : INFO: instruction count: 424: __C3062Compiled.__m3066split_ToArray 2023-04-22 21:14:23.328 : INFO: instruction count: 177: __C3062Compiled.__m3103split_ToArray 2023-04-22 21:14:23.328 : INFO: instruction count: 35: __C3062Compiled.__m3111arrayref_bounds_check 2023-04-22 21:14:23.328 : INFO: instruction count: 11: __C3062Compiled.__m3132ord_equiv 2023-04-22 21:14:23.328 : INFO: instruction count: 14: __C3062Compiled.__m3133ord_equivNonnull 2023-04-22 21:14:23.328 : INFO: instruction count: 4: __C3062Compiled.setBackend 2023-04-22 21:14:23.328 : INFO: instruction count: 9: __C3062Compiled.__m3371ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:23.328 : INFO: instruction count: 57: __C3062Compiled.__m3372ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 51: __C3062Compiled.__m3373ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:23.328 : INFO: instruction count: 41: 
__C3062Compiled.__m3374ENCODE_SIntervalPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND 2023-04-22 21:14:23.328 : INFO: instruction count: 21: __C3062Compiled.__m3375ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 13: __C3062Compiled.__m3376ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 4: __C3062Compiled.__m3377ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:23.328 : INFO: instruction count: 4: __C3062Compiled.__m3378ENCODE_SBoolean$_TO_r_bool 2023-04-22 21:14:23.328 : INFO: instruction count: 35: __C3062Compiled.__m3379ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 33: __C3062Compiled.__m3380ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 49: __C3062Compiled.__m3381ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.328 : INFO: instruction count: 16: __C3062Compiled.__m3382ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:23.328 : INFO: instruction count: 9: __C3062Compiled.__m3383ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:14:23.328 : INFO: instruction count: 1: __C3062Compiled.__m3384ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:14:23.329 : INFO: instruction count: 27: __C3062Compiled.__m3387DECODE_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.329 : INFO: instruction count: 58: __C3062Compiled.__m3388INPLACE_DECODE_r_array_of_r_struct_of_r_int32ANDr_binaryEND_TO_r_array_of_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:23.329 : INFO: instruction count: 26: __C3062Compiled.__m3389INPLACE_DECODE_r_struct_of_r_int32ANDr_binaryEND_TO_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:23.329 : INFO: instruction count: 10: __C3062Compiled.__m3390INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:23.329 : INFO: instruction count: 31: __C3062Compiled.__m3391INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:23.329 : INFO: instruction count: 12: __C3062Compiled.__m3410setup_jab 2023-04-22 21:14:23.329 : INFO: instruction count: 47: __C3062Compiled.__m3413dependent_sorting_func 2023-04-22 21:14:23.329 : INFO: instruction count: 11: __C3062Compiled.__m3416ord_lt 2023-04-22 21:14:23.329 : INFO: instruction count: 14: __C3062Compiled.__m3417ord_ltNonnull 2023-04-22 21:14:23.329 : INFO: instruction count: 202: __C3062Compiled.__m3418arraySorter_outer 2023-04-22 21:14:23.329 : INFO: instruction count: 66: __C3062Compiled.__m3419arraySorter_merge 2023-04-22 21:14:23.329 : INFO: instruction count: 36: __C3062Compiled.__m3420arraySorter_splitMerge 2023-04-22 21:14:23.329 : INFO: instruction count: 9: __C3062Compiled.setPartitionIndex 2023-04-22 21:14:23.329 : INFO: instruction count: 4: __C3062Compiled.addPartitionRegion 2023-04-22 21:14:23.329 : INFO: instruction count: 4: __C3062Compiled.setPool 2023-04-22 21:14:23.329 : INFO: instruction count: 3: __C3062Compiled.addHailClassLoader 2023-04-22 21:14:23.329 : INFO: instruction count: 3: __C3062Compiled.addFS 2023-04-22 21:14:23.329 : INFO: instruction count: 4: __C3062Compiled.addTaskContext 2023-04-22 21:14:23.329 : INFO: 
instruction count: 3: __C3062Compiled.setObjects 2023-04-22 21:14:23.330 : INFO: instruction count: 133: __C3062Compiled.addAndDecodeLiterals 2023-04-22 21:14:23.332 : INFO: instruction count: 63: __C3062Compiled.__m3435DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_array_of_r_int64ANDr_array_of_r_int64ANDr_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.332 : INFO: instruction count: 8: __C3062Compiled.__m3436INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:23.332 : INFO: instruction count: 58: __C3062Compiled.__m3437INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 44: __C3062Compiled.__m3438INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 58: __C3062Compiled.__m3439INPLACE_DECODE_r_array_of_r_int64_TO_r_array_of_r_int64 2023-04-22 21:14:23.332 : INFO: instruction count: 10: __C3062Compiled.__m3440INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:14:23.332 : INFO: instruction count: 58: __C3062Compiled.__m3441INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_array_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 26: __C3062Compiled.__m3442INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 44: __C3062Compiled.__m3443INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 26: __C3062Compiled.__m3444INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 17: __C3062Compiled.__m3445INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:23.332 : INFO: instruction count: 10: __C3062Compiled.__m3446INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:23.332 : INFO: instruction count: 58: __C3062Compiled.__m3447INPLACE_DECODE_r_array_of_r_int32_TO_r_array_of_r_int32 2023-04-22 21:14:23.333 : INFO: instruction count: 110: __C3062Compiled.__m3064split_ToArray_region22_24 2023-04-22 21:14:23.333 : INFO: instruction count: 586: __C3062Compiled.__m3064split_ToArray_region16_69 2023-04-22 21:14:23.333 : INFO: instruction count: 3: __C3448__m3064split_ToArraySpills. 2023-04-22 21:14:23.333 : INFO: instruction count: 3: __C3385staticWrapperClass_1. 
2023-04-22 21:14:23.346 : INFO: initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_3249 (ToStream False (ArraySort __iruid_3250 __iruid_3251 (StreamFlatMap __iruid_3252 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_3253 __iruid_3254 (StreamFlatMap __iruid_3255 (StreamZip -1 AssertSameLength (__iruid_3256 __iruid_3257) (Let __iruid_3258 (ToArray (StreamZip -1 AssertSameLength (__iruid_3259 __iruid_3260 __iruid_3261) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_3259)) (mwStartIdx (Cast Int32 (Ref __iruid_3260))) (mwStopIdx (Cast Int32 (Ref __iruid_3261)))))) (StreamMap __iruid_3262 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_3262))) (oldContexts (ToArray (StreamMap __iruid_3263 (ToStream False (GetField parentPartitions (Ref __iruid_3262))) (ArrayRef -1 (Ref __iruid_3258) (Ref __iruid_3263)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_3264 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_3256)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_3264) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_3264) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_3264)) (blockRowIdx (Ref __iruid_3257))))) (Ref __iruid_3255)) (Literal Struct{} ) (ToArray (StreamMap __iruid_3265 (Let __iruid_3266 (GetField blockStart (Ref __iruid_3253)) (Let __iruid_3267 (ToArray (StreamMap __iruid_3268 (Let __iruid_3269 (GetField oldTableCtx (Ref __iruid_3253)) (Let __iruid_3270 (GetField partitionBound (Ref __iruid_3269)) (StreamTakeWhile __iruid_3271 (StreamDropWhile __iruid_3272 (StreamFlatMap __iruid_3273 (ToStream True (GetField oldContexts (Ref __iruid_3269))) (StreamZip -1 AssertSameLength (__iruid_3274 __iruid_3275) (StreamMap __iruid_3276 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_3273))) (Let __iruid_3277 (ToArray (StreamMap __iruid_3278 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3276))) (Let __iruid_3279 (InsertFields (SelectFields () (Ref __iruid_3278)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_3278))))) (If (IsNA (Ref __iruid_3279)) (Literal Struct{__gt:Int32} ) (Ref __iruid_3279))))) (Let __iruid_3280 (StreamAgg __iruid_3281 (StreamFilter __iruid_3282 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_3277)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3282))))) (InsertFields (SelectFields () (Ref __iruid_3276)) None (__mean_gt (AggLet __iruid_3283 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_3277) (Ref __iruid_3281)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_3283))) (Let __iruid_3284 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_3283)))))) (Cast Float64 (Ref __iruid_3284)))))))) (InsertFields (Ref __iruid_3280) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_3285 (ToStream False (Ref __iruid_3277)) (InsertFields (SelectFields () (Ref __iruid_3285)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_3285))) (GetField __mean_gt (Ref __iruid_3280)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_3273)) (GetField mwStopIdx (Ref __iruid_3273)) (I32 1)) (InsertFields (Ref __iruid_3274) None (__iruid_2764 (Ref __iruid_3275))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3272)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesStart () Boolean (Ref __iruid_3270)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_3271)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_3270)) (Apply -1 includesEnd () Boolean (Ref __iruid_3270)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_3268))) (rowOfData (ToArray (StreamMap __iruid_3286 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_3268)) (Ref __iruid_3266) (ApplyBinaryPrimOp Add (Ref __iruid_3266) (GetField blockSize (Ref __iruid_3253))) (I32 1))) (GetField __uid_6 (Ref __iruid_3286)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_3253))) (blockColIdx (GetField blockColIdx (Ref __iruid_3253))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_3287 (ToStream False (Ref __iruid_3267)) (ToStream False (GetField rowOfData (Ref __iruid_3287)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_3267))) (Cast Int64 (GetField blockSize (Ref __iruid_3253)))) (True))))))) (Let __iruid_3288 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_3265)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_3265)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_3288) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_3265)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_3288))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_3252))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_3250)) (GetTupleElement 0 (Ref __iruid_3251))))) (GetTupleElement 1 (Ref __iruid_3249)))))) 2023-04-22 21:14:23.407 BlockManagerInfo: INFO: Removed broadcast_167_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 351.3 KiB, free: 25.3 GiB) 2023-04-22 21:14:23.455 : INFO: after optimize: compileLowerer, initial IR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_4103 (ToStream False (ArraySort __iruid_4104 __iruid_4105 (StreamFlatMap __iruid_4106 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_4107 __iruid_4108 (StreamFlatMap __iruid_4109 (StreamZip -1 AssertSameLength (__iruid_4110 __iruid_4111) (Let __iruid_4112 (ToArray (StreamZip -1 AssertSameLength (__iruid_4113 __iruid_4114 __iruid_4115) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_4113)) (mwStartIdx (Cast Int32 (Ref __iruid_4114))) (mwStopIdx (Cast Int32 (Ref __iruid_4115)))))) (StreamMap __iruid_4116 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_4116))) (oldContexts (ToArray (StreamMap __iruid_4117 (ToStream False (GetField parentPartitions (Ref __iruid_4116))) (ArrayRef -1 (Ref __iruid_4112) (Ref __iruid_4117)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_4118 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref 
__iruid_4110)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_4118) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_4118) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_4118)) (blockRowIdx (Ref __iruid_4111))))) (Ref __iruid_4109)) (Literal Struct{} ) (ToArray (StreamMap __iruid_4119 (Let __iruid_4120 (GetField blockStart (Ref __iruid_4107)) (Let __iruid_4121 (ToArray (StreamMap __iruid_4122 (Let __iruid_4123 (GetField oldTableCtx (Ref __iruid_4107)) (Let __iruid_4124 (GetField partitionBound (Ref __iruid_4123)) (StreamTakeWhile __iruid_4125 (StreamDropWhile __iruid_4126 (StreamFlatMap __iruid_4127 (ToStream True (GetField oldContexts (Ref __iruid_4123))) (StreamZip -1 AssertSameLength (__iruid_4128 __iruid_4129) (StreamMap __iruid_4130 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4127))) (Let __iruid_4131 (ToArray (StreamMap __iruid_4132 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4130))) (Let __iruid_4133 (InsertFields (SelectFields () (Ref __iruid_4132)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4132))))) (If (IsNA (Ref __iruid_4133)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4133))))) (Let __iruid_4134 (StreamAgg __iruid_4135 (StreamFilter __iruid_4136 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4131)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4131) (Ref __iruid_4136))))) (InsertFields (SelectFields () (Ref __iruid_4130)) None (__mean_gt (AggLet __iruid_4137 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4131) (Ref __iruid_4135)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_4137))) (Let __iruid_4138 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4137)))))) (Cast Float64 (Ref __iruid_4138)))))))) (InsertFields (Ref __iruid_4134) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4139 (ToStream False (Ref __iruid_4131)) (InsertFields (SelectFields () (Ref __iruid_4139)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4139))) (GetField __mean_gt (Ref __iruid_4134)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4127)) (GetField mwStopIdx (Ref __iruid_4127)) (I32 1)) (InsertFields (Ref __iruid_4128) None (__iruid_2764 (Ref __iruid_4129))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4126)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4124)) (Apply -1 includesStart () Boolean (Ref __iruid_4124)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4125)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4124)) (Apply -1 includesEnd () Boolean (Ref __iruid_4124)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4122))) (rowOfData (ToArray (StreamMap __iruid_4140 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4122)) (Ref __iruid_4120) (ApplyBinaryPrimOp Add (Ref __iruid_4120) (GetField blockSize (Ref __iruid_4107))) (I32 1))) (GetField __uid_6 (Ref __iruid_4140)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_4107))) (blockColIdx (GetField blockColIdx (Ref __iruid_4107))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4141 (ToStream False (Ref __iruid_4121)) (ToStream False (GetField rowOfData (Ref __iruid_4141)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4121))) (Cast Int64 (GetField blockSize (Ref __iruid_4107)))) (True))))))) (Let __iruid_4142 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_4119)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_4119)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4142) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_4119)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4142))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_4106))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_4104)) (GetTupleElement 0 (Ref __iruid_4105))))) (GetTupleElement 1 (Ref __iruid_4103)))))) 2023-04-22 21:14:23.479 : INFO: after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_4103 (ToStream False (ArraySort __iruid_4104 __iruid_4105 (StreamFlatMap __iruid_4106 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_4107 __iruid_4108 (StreamFlatMap __iruid_4109 (StreamZip -1 AssertSameLength (__iruid_4110 __iruid_4111) (Let 
__iruid_4112 (ToArray (StreamZip -1 AssertSameLength (__iruid_4113 __iruid_4114 __iruid_4115) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_4113)) (mwStartIdx (Cast Int32 (Ref __iruid_4114))) (mwStopIdx (Cast Int32 (Ref __iruid_4115)))))) (StreamMap __iruid_4116 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_4116))) (oldContexts (ToArray (StreamMap __iruid_4117 (ToStream False (GetField parentPartitions (Ref __iruid_4116))) (ArrayRef -1 (Ref __iruid_4112) (Ref __iruid_4117)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_4118 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_4110)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_4118) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_4118) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_4118)) (blockRowIdx (Ref __iruid_4111))))) (Ref __iruid_4109)) (Literal Struct{} ) (ToArray (StreamMap __iruid_4119 (Let __iruid_4120 (GetField blockStart (Ref __iruid_4107)) (Let __iruid_4121 (ToArray (StreamMap __iruid_4122 (Let __iruid_4123 (GetField oldTableCtx (Ref __iruid_4107)) (Let __iruid_4124 (GetField partitionBound (Ref __iruid_4123)) (StreamTakeWhile __iruid_4125 (StreamDropWhile __iruid_4126 (StreamFlatMap __iruid_4127 (ToStream True (GetField oldContexts (Ref __iruid_4123))) (StreamZip -1 AssertSameLength (__iruid_4128 __iruid_4129) (StreamMap __iruid_4130 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4127))) (Let __iruid_4131 (ToArray (StreamMap __iruid_4132 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4130))) (Let __iruid_4133 (InsertFields (SelectFields () (Ref __iruid_4132)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4132))))) (If (IsNA (Ref __iruid_4133)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4133))))) (Let __iruid_4134 (StreamAgg __iruid_4135 (StreamFilter __iruid_4136 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4131)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4131) (Ref __iruid_4136))))) (InsertFields (SelectFields () (Ref __iruid_4130)) None (__mean_gt (AggLet __iruid_4137 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4131) (Ref __iruid_4135)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_4137))) (Let __iruid_4138 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4137)))))) (Cast Float64 (Ref __iruid_4138)))))))) (InsertFields (Ref __iruid_4134) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4139 (ToStream False (Ref __iruid_4131)) (InsertFields (SelectFields () (Ref __iruid_4139)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4139))) (GetField __mean_gt (Ref __iruid_4134)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4127)) (GetField mwStopIdx (Ref __iruid_4127)) (I32 1)) (InsertFields (Ref __iruid_4128) None (__iruid_2764 (Ref __iruid_4129))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4126)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4124)) (Apply -1 includesStart () Boolean (Ref __iruid_4124)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4125)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4124)) (Apply -1 includesEnd () Boolean (Ref __iruid_4124)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4122))) (rowOfData (ToArray (StreamMap __iruid_4140 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4122)) (Ref __iruid_4120) (ApplyBinaryPrimOp Add (Ref __iruid_4120) (GetField blockSize (Ref __iruid_4107))) (I32 1))) (GetField __uid_6 (Ref __iruid_4140)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_4107))) (blockColIdx (GetField blockColIdx (Ref __iruid_4107))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4141 (ToStream False (Ref __iruid_4121)) (ToStream False (GetField rowOfData (Ref __iruid_4141)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4121))) (Cast Int64 (GetField blockSize (Ref __iruid_4107)))) (True))))))) (Let __iruid_4142 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_4119)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_4119)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4142) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_4119)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4142))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_4106))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_4104)) (GetTupleElement 0 (Ref __iruid_4105))))) (GetTupleElement 1 (Ref __iruid_4103)))))) 2023-04-22 21:14:23.531 BlockManagerInfo: INFO: Removed broadcast_151_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.587 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 247: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_4223 (ToStream False (ArraySort __iruid_4224 __iruid_4225 (StreamFlatMap __iruid_4226 
(ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_4227 __iruid_4228 (StreamFlatMap __iruid_4229 (StreamZip -1 AssertSameLength (__iruid_4230 __iruid_4231) (Let __iruid_4232 (ToArray (StreamZip -1 AssertSameLength (__iruid_4233 __iruid_4234 __iruid_4235) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_4233)) (mwStartIdx (Cast Int32 (Ref __iruid_4234))) (mwStopIdx (Cast Int32 (Ref __iruid_4235)))))) (StreamMap __iruid_4236 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_4236))) (oldContexts (ToArray (StreamMap __iruid_4237 (ToStream False (GetField parentPartitions (Ref __iruid_4236))) (ArrayRef -1 (Ref __iruid_4232) (Ref __iruid_4237)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_4238 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_4230)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_4238) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_4238) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_4238)) (blockRowIdx (Ref __iruid_4231))))) (Ref __iruid_4229)) (Literal Struct{} ) (ToArray (StreamMap __iruid_4239 (Let __iruid_4240 (GetField blockStart (Ref __iruid_4227)) (Let __iruid_4241 (ToArray (StreamMap __iruid_4242 (Let __iruid_4243 (GetField oldTableCtx (Ref __iruid_4227)) (Let __iruid_4244 (GetField partitionBound (Ref __iruid_4243)) (StreamTakeWhile __iruid_4245 (StreamDropWhile __iruid_4246 (StreamFlatMap __iruid_4247 (ToStream True (GetField oldContexts (Ref __iruid_4243))) (StreamZip -1 AssertSameLength (__iruid_4248 __iruid_4249) (StreamMap __iruid_4250 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4247))) (Let __iruid_4251 (ToArray (StreamMap __iruid_4252 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4250))) (Let __iruid_4253 (InsertFields (SelectFields () (Ref __iruid_4252)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4252))))) (If (IsNA (Ref __iruid_4253)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4253))))) (Let __iruid_4254 (StreamAgg __iruid_4255 (StreamFilter __iruid_4256 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4251)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4251) (Ref __iruid_4256))))) (InsertFields (SelectFields () (Ref __iruid_4250)) None (__mean_gt (AggLet __iruid_4257 False (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4251) (Ref __iruid_4255)))) (ApplyBinaryPrimOp FloatingPointDivide (ApplyAggOp Sum () ((Ref __iruid_4257))) (Let __iruid_4258 (ApplyAggOp Sum () ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4257)))))) (Cast Float64 (Ref __iruid_4258)))))))) (InsertFields (Ref __iruid_4254) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4259 (ToStream False (Ref __iruid_4251)) (InsertFields (SelectFields () (Ref __iruid_4259)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4259))) (GetField __mean_gt (Ref __iruid_4254)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4247)) (GetField mwStopIdx (Ref __iruid_4247)) (I32 1)) (InsertFields (Ref __iruid_4248) None (__iruid_2764 (Ref __iruid_4249))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4246)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4244)) (Apply -1 includesStart () Boolean (Ref __iruid_4244)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4245)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4244)) (Apply -1 includesEnd () Boolean (Ref __iruid_4244)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4242))) (rowOfData (ToArray (StreamMap __iruid_4260 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4242)) (Ref __iruid_4240) (ApplyBinaryPrimOp Add (Ref __iruid_4240) (GetField blockSize (Ref __iruid_4227))) (I32 1))) (GetField __uid_6 (Ref __iruid_4260)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_4227))) (blockColIdx (GetField blockColIdx (Ref __iruid_4227))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4261 (ToStream False (Ref __iruid_4241)) (ToStream False (GetField rowOfData (Ref __iruid_4261)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4241))) (Cast Int64 (GetField blockSize (Ref __iruid_4227)))) (True))))))) (Let __iruid_4262 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_4239)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_4239)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4262) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_4239)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4262))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_4226))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_4224)) (GetTupleElement 0 (Ref __iruid_4225))))) (GetTupleElement 1 (Ref __iruid_4223)))))) 2023-04-22 21:14:23.619 : INFO: after LowerArrayAggsToRunAggs: IR size 261: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap __iruid_4223 (ToStream False (ArraySort __iruid_4224 __iruid_4225 (StreamFlatMap __iruid_4226 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_4227 __iruid_4228 (StreamFlatMap __iruid_4229 (StreamZip -1 AssertSameLength (__iruid_4230 __iruid_4231) (Let __iruid_4232 (ToArray (StreamZip -1 AssertSameLength (__iruid_4233 __iruid_4234 __iruid_4235) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_4233)) (mwStartIdx (Cast Int32 (Ref __iruid_4234))) (mwStopIdx (Cast Int32 (Ref __iruid_4235)))))) (StreamMap __iruid_4236 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_4236))) (oldContexts (ToArray (StreamMap __iruid_4237 (ToStream False (GetField parentPartitions (Ref __iruid_4236))) (ArrayRef -1 (Ref __iruid_4232) (Ref __iruid_4237)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_4238 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_4230)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_4238) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_4238) (I32 1)) (I32 55) (I32 4096))) 
(blockColIdx (Ref __iruid_4238)) (blockRowIdx (Ref __iruid_4231))))) (Ref __iruid_4229)) (Literal Struct{} ) (ToArray (StreamMap __iruid_4239 (Let __iruid_4240 (GetField blockStart (Ref __iruid_4227)) (Let __iruid_4241 (ToArray (StreamMap __iruid_4242 (Let __iruid_4243 (GetField oldTableCtx (Ref __iruid_4227)) (Let __iruid_4244 (GetField partitionBound (Ref __iruid_4243)) (StreamTakeWhile __iruid_4245 (StreamDropWhile __iruid_4246 (StreamFlatMap __iruid_4247 (ToStream True (GetField oldContexts (Ref __iruid_4243))) (StreamZip -1 AssertSameLength (__iruid_4248 __iruid_4249) (StreamMap __iruid_4250 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4247))) (Let __iruid_4251 (ToArray (StreamMap __iruid_4252 (ToStream False (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4250))) (Let __iruid_4253 (InsertFields (SelectFields () (Ref __iruid_4252)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4252))))) (If (IsNA (Ref __iruid_4253)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4253))))) (Let __iruid_4254 (Let __iruid_4263 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_4255 (StreamFilter __iruid_4256 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4251)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4251) (Ref __iruid_4256))))) (Let __iruid_4257 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4251) (Ref __iruid_4255)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_4257))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4257)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (InsertFields (SelectFields () (Ref __iruid_4250)) None (__mean_gt (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_4263)) (Let __iruid_4258 (GetTupleElement 1 (Ref __iruid_4263)) (Cast Float64 (Ref __iruid_4258))))))) (InsertFields (Ref __iruid_4254) ("__mean_gt" "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4259 (ToStream False (Ref __iruid_4251)) (InsertFields (SelectFields () (Ref __iruid_4259)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4259))) (GetField __mean_gt (Ref __iruid_4254)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4247)) (GetField mwStopIdx (Ref __iruid_4247)) (I32 1)) (InsertFields (Ref __iruid_4248) None (__iruid_2764 (Ref __iruid_4249))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4246)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4244)) (Apply -1 includesStart () Boolean (Ref __iruid_4244)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4245)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4244)) (Apply -1 includesEnd () Boolean (Ref __iruid_4244)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4242))) (rowOfData (ToArray (StreamMap __iruid_4260 (ToStream False (ArraySlice (GetField `the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4242)) (Ref __iruid_4240) (ApplyBinaryPrimOp Add (Ref __iruid_4240) (GetField blockSize (Ref __iruid_4227))) (I32 1))) (GetField __uid_6 (Ref __iruid_4260)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_4227))) (blockColIdx (GetField blockColIdx (Ref __iruid_4227))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4261 (ToStream False (Ref __iruid_4241)) (ToStream False (GetField rowOfData (Ref __iruid_4261)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4241))) (Cast Int64 (GetField blockSize (Ref __iruid_4227)))) (True))))))) (Let __iruid_4262 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_4239)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_4239)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4262) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_4239)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4262))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_4226))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_4224)) (GetTupleElement 0 (Ref __iruid_4225))))) (GetTupleElement 1 (Ref __iruid_4223)))))) 2023-04-22 21:14:23.662 : INFO: Prune: InsertFields: eliminating field '__mean_gt' 2023-04-22 21:14:23.679 BlockManagerInfo: INFO: Removed broadcast_159_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:14:23.733 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 256: (WriteMetadata "{\"name\":\"RelationalWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"overwrite\":false}" (WriteMetadata "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"stageLocally\":false,\"typ\":{\"elementType\":\"Float64\",\"shape\":[114591,4151],\"isRowVector\":false,\"blockSize\":4096,\"sparsity\":{}}}" (ToArray (StreamMap 
__iruid_4349 (ToStream False (ArraySort __iruid_4350 __iruid_4351 (StreamFlatMap __iruid_4352 (ToStream False (CollectDistributedArray matrix_block_matrix_writer __iruid_4353 __iruid_4354 (StreamFlatMap __iruid_4355 (StreamZip -1 AssertSameLength (__iruid_4356 __iruid_4357) (Let __iruid_4358 (ToArray (StreamZip -1 AssertSameLength (__iruid_4359 __iruid_4360 __iruid_4361) (ToStream False (Literal Array[Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}] )) (ToStream False (Literal Array[Int64] )) (ToStream False (Literal Array[Int64] )) (MakeStruct (mwOld (Ref __iruid_4359)) (mwStartIdx (Cast Int32 (Ref __iruid_4360))) (mwStopIdx (Cast Int32 (Ref __iruid_4361)))))) (StreamMap __iruid_4362 (ToStream False (Literal Array[Struct{partitionBound:Interval[Tuple[Struct{__iruid_2764:Int32},Int32]],parentPartitions:Array[Int32]}] )) (MakeStruct (partitionBound (GetField partitionBound (Ref __iruid_4362))) (oldContexts (ToArray (StreamMap __iruid_4363 (ToStream False (GetField parentPartitions (Ref __iruid_4362))) (ArrayRef -1 (Ref __iruid_4358) (Ref __iruid_4363)))))))) (StreamRange -1 False (I32 0) (I32 28) (I32 1)) (StreamMap __iruid_4364 (StreamRange -1 False (I32 0) (I32 2) (I32 1)) (MakeStruct (oldTableCtx (Ref __iruid_4356)) (blockStart (ApplyBinaryPrimOp Multiply (Ref __iruid_4364) (I32 4096))) (blockSize (If (ApplyComparisonOp EQWithNA (Ref __iruid_4364) (I32 1)) (I32 55) (I32 4096))) (blockColIdx (Ref __iruid_4364)) (blockRowIdx (Ref __iruid_4357))))) (Ref __iruid_4355)) (Literal Struct{} ) (ToArray (StreamMap __iruid_4365 (Let __iruid_4366 (GetField blockStart (Ref __iruid_4353)) (Let __iruid_4367 (ToArray (StreamMap __iruid_4368 (Let __iruid_4369 (GetField oldTableCtx (Ref __iruid_4353)) (Let __iruid_4370 (GetField partitionBound (Ref __iruid_4369)) (StreamTakeWhile __iruid_4371 (StreamDropWhile __iruid_4372 (StreamFlatMap __iruid_4373 (ToStream True (GetField oldContexts (Ref __iruid_4369))) (StreamZip -1 AssertSameLength (__iruid_4374 __iruid_4375) (StreamMap __iruid_4376 (ReadPartition Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]} "{\"category\":\"PartitionIteratorLongReader\",\"fullRowType\":\"Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64,__row_uid:Tuple[Int64,Int64],`the entries! [877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{GT:Call}]}\",\"uidFieldName\":\"__row_uid\",\"contextType\":\"Struct{bed:String,start:Int32,end:Int32,partitionIndex:Int32}\"}" (GetField mwOld (Ref __iruid_4373))) (Let __iruid_4377 (ToArray (StreamMap __iruid_4378 (ToStream False (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4376))) (Let __iruid_4379 (InsertFields (SelectFields () (Ref __iruid_4378)) None (__gt (Apply 13 nNonRefAlleles () Int32 (GetField GT (Ref __iruid_4378))))) (If (IsNA (Ref __iruid_4379)) (Literal Struct{__gt:Int32} ) (Ref __iruid_4379))))) (Let __iruid_4380 (RunAgg ((TypedStateSig +PFloat64) (TypedStateSig +PInt64)) (Begin (Begin (InitOp 0 (Sum (TypedStateSig +PFloat64)) ()) (InitOp 1 (Sum (TypedStateSig +PInt64)) ())) (StreamFor __iruid_4381 (StreamFilter __iruid_4382 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4377)) (I32 1)) (ApplyUnaryPrimOp Bang (IsNA (ArrayRef -1 (Ref __iruid_4377) (Ref __iruid_4382))))) (Let __iruid_4383 (Cast Float64 (GetField __gt (ArrayRef -1 (Ref __iruid_4377) (Ref __iruid_4381)))) (Begin (SeqOp 0 (Sum (TypedStateSig +PFloat64)) ((Ref __iruid_4383))) (SeqOp 1 (Sum (TypedStateSig +PInt64)) ((Apply 15 toInt64 () Int64 (ApplyUnaryPrimOp Bang (IsNA (Ref __iruid_4383)))))))))) (MakeTuple (0 1) (ResultOp 0 (Sum (TypedStateSig +PFloat64))) (ResultOp 1 (Sum (TypedStateSig +PInt64))))) (Let __iruid_4384 (ApplyBinaryPrimOp FloatingPointDivide (GetTupleElement 0 (Ref __iruid_4380)) (Cast Float64 (GetTupleElement 1 (Ref __iruid_4380)))) (InsertFields (SelectFields () (Ref __iruid_4376)) ( "the entries! [877f12a8827e18f61222c6c8c5fb04a8]") (`the entries! [877f12a8827e18f61222c6c8c5fb04a8]` (ToArray (StreamMap __iruid_4385 (ToStream False (Ref __iruid_4377)) (InsertFields (SelectFields () (Ref __iruid_4385)) None (__uid_6 (Coalesce (Cast Float64 (GetField __gt (Ref __iruid_4385))) (Ref __iruid_4384)))))))))))) (StreamRange -1 False (GetField mwStartIdx (Ref __iruid_4373)) (GetField mwStopIdx (Ref __iruid_4373)) (I32 1)) (InsertFields (Ref __iruid_4374) None (__iruid_2764 (Ref __iruid_4375))))) (Apply -1 pointLessThanPartitionIntervalLeftEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4372)) (ApplySpecial -1 start () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4370)) (Apply -1 includesStart () Boolean (Ref __iruid_4370)))) (Apply -1 pointLessThanPartitionIntervalRightEndpoint () Boolean (SelectFields (__iruid_2764) (Ref __iruid_4371)) (ApplySpecial -1 end () Tuple[Struct{__iruid_2764:Int32},Int32] (Ref __iruid_4370)) (Apply -1 includesEnd () Boolean (Ref __iruid_4370)))))) (MakeStruct (__iruid_2764 (GetField __iruid_2764 (Ref __iruid_4368))) (rowOfData (ToArray (StreamMap __iruid_4386 (ToStream False (ArraySlice (GetField `the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]` (Ref __iruid_4368)) (Ref __iruid_4366) (ApplyBinaryPrimOp Add (Ref __iruid_4366) (GetField blockSize (Ref __iruid_4353))) (I32 1))) (GetField __uid_6 (Ref __iruid_4386)))))))) (MakeStream Stream[Struct{blockRowIdx:Int32,blockColIdx:Int32,ndBlock:NDArray[Float64,2]}] False (MakeStruct (blockRowIdx (GetField blockRowIdx (Ref __iruid_4353))) (blockColIdx (GetField blockColIdx (Ref __iruid_4353))) (ndBlock (MakeNDArray -1 (StreamFlatMap __iruid_4387 (ToStream False (Ref __iruid_4367)) (ToStream False (GetField rowOfData (Ref __iruid_4387)))) (MakeTuple (0 1) (Cast Int64 (ArrayLen (Ref __iruid_4367))) (Cast Int64 (GetField blockSize (Ref __iruid_4353)))) (True))))))) (Let __iruid_4388 (ApplyBinaryPrimOp Add (GetField blockRowIdx (Ref __iruid_4365)) (ApplyBinaryPrimOp Multiply (GetField blockColIdx (Ref __iruid_4365)) (I32 28))) (MakeTuple (0 1) (Ref __iruid_4388) (WriteValue "{\"name\":\"TypedCodecSpec\",\"_eType\":\"+ENDArray[EFloat64]\",\"_vType\":\"NDArray[Float64,2]\",\"_bufferSpec\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}" (GetField ndBlock (Ref __iruid_4365)) (Apply -1 concat () String (Apply -1 concat () String (Apply -1 concat () String (Str "/fg/saxena...") (Apply -1 str () String (Ref __iruid_4388))) (Str "-")) (UUID4 __iruid_2789))))))) (NA String))) (ToStream False (Ref __iruid_4352))) (ApplyComparisonOp LT (GetTupleElement 0 (Ref __iruid_4350)) (GetTupleElement 0 (Ref __iruid_4351))))) (GetTupleElement 1 (Ref __iruid_4349)))))) 2023-04-22 21:14:23.795 BlockManagerInfo: INFO: Removed broadcast_148_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.876 : INFO: encoder cache hit 2023-04-22 21:14:23.876 MemoryStore: INFO: Block broadcast_191 stored as values in memory (estimated size 200.0 B, free 25.1 GiB) 2023-04-22 21:14:23.878 MemoryStore: INFO: Block broadcast_191_piece0 stored as bytes in memory (estimated size 155.0 B, free 25.1 GiB) 2023-04-22 21:14:23.878 BlockManagerInfo: INFO: Added broadcast_191_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 155.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.879 SparkContext: INFO: Created broadcast 191 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:23.879 : INFO: instruction count: 3: __C3537HailClassLoaderContainer. 2023-04-22 21:14:23.879 : INFO: instruction count: 3: __C3537HailClassLoaderContainer. 2023-04-22 21:14:23.879 : INFO: instruction count: 3: __C3539FSContainer. 2023-04-22 21:14:23.879 : INFO: instruction count: 3: __C3539FSContainer. 2023-04-22 21:14:23.907 BlockManagerInfo: INFO: Removed broadcast_164_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:14:23.944 : INFO: instruction count: 3: __C3541collect_distributed_array_matrix_block_matrix_writer. 
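The WriteMetadata payload and the lowered IR above fix the block-matrix geometry for this write: shape [114591, 4151], blockSize 4096. The short sketch below (values taken from the log; the helper name is ours) reproduces the constants that recur in the IR: 28 row blocks, 2 column blocks, a 55-column trailing block, and the linear block index blockRowIdx + blockColIdx * 28 used to sort blocks before writing.

# Sketch (not Hail code): derive the block-grid constants visible in the IR
# from the BlockMatrix geometry logged in WriteMetadata.
import math

n_rows, n_cols, block_size = 114591, 4151, 4096   # from the WriteMetadata JSON

row_blocks = math.ceil(n_rows / block_size)               # 28, matching StreamRange (I32 0) (I32 28)
col_blocks = math.ceil(n_cols / block_size)               # 2,  matching StreamRange (I32 0) (I32 2)
last_col_width = n_cols - (col_blocks - 1) * block_size   # 55, the (I32 55) blockSize branch

def linear_block_index(block_row_idx: int, block_col_idx: int) -> int:
    # Mirrors ApplyBinaryPrimOp Add blockRowIdx (Multiply blockColIdx 28) in the IR;
    # this is the key the ArraySort comparator orders on before the write.
    return block_row_idx + block_col_idx * row_blocks

assert row_blocks == 28 and col_blocks == 2 and last_col_width == 55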
2023-04-22 21:14:23.944 : INFO: instruction count: 111: __C3541collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:23.945 : INFO: instruction count: 17: __C3541collect_distributed_array_matrix_block_matrix_writer.apply 2023-04-22 21:14:23.945 : INFO: instruction count: 27: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3543DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.945 : INFO: instruction count: 53: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3544INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 26: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3545INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:23.945 : INFO: instruction count: 44: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3546INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 26: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3547INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 17: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3548INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 10: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3549INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:23.945 : INFO: instruction count: 10: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3550INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:23.945 : INFO: instruction count: 58: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3551INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 35: 
__C3541collect_distributed_array_matrix_block_matrix_writer.__m3552INPLACE_DECODE_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END_TO_r_struct_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 44: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3553INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:23.945 : INFO: instruction count: 31: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3554INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:23.945 : INFO: instruction count: 27: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3556DECODE_r_struct_of_r_struct_of_ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:23.945 : INFO: instruction count: 8: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3557INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:23.946 : INFO: instruction count: 483: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3559split_ToArray 2023-04-22 21:14:23.946 : INFO: instruction count: 699: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3562split_ToArray 2023-04-22 21:14:23.947 : INFO: instruction count: 256: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3583split_ToArray 2023-04-22 21:14:23.947 : INFO: instruction count: 8: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3591nNonRefAlleles 2023-04-22 21:14:23.947 : INFO: instruction count: 9: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3606begin_group_0 2023-04-22 21:14:23.947 : INFO: instruction count: 17: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3607begin_group_0 2023-04-22 21:14:23.947 : INFO: instruction count: 158: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3608split_StreamFor 2023-04-22 21:14:23.947 : INFO: instruction count: 35: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3616arrayref_bounds_check 2023-04-22 21:14:23.947 : INFO: instruction count: 70: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3620begin_group_0 2023-04-22 21:14:23.947 : INFO: instruction count: 5: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3623toInt64 2023-04-22 21:14:23.947 : INFO: instruction count: 170: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3630split_ToArray 2023-04-22 21:14:23.947 : INFO: instruction count: 2: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3664includesStart 2023-04-22 21:14:23.947 : INFO: instruction count: 111: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3665pointLessThanPartitionIntervalLeftEndpoint 2023-04-22 21:14:23.947 : INFO: instruction count: 11: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3666ord_compare 2023-04-22 21:14:23.947 : INFO: instruction count: 8: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3667ord_compareNonnull 2023-04-22 21:14:23.948 : INFO: instruction count: 2: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3673includesEnd 2023-04-22 21:14:23.948 : INFO: instruction count: 102: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3674pointLessThanPartitionIntervalRightEndpoint 2023-04-22 21:14:23.948 : INFO: instruction count: 364: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3680split_ToArray 2023-04-22 
21:14:23.948 : INFO: instruction count: 12: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3693setup_jab 2023-04-22 21:14:23.948 : INFO: instruction count: 18: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3724str 2023-04-22 21:14:23.948 : INFO: instruction count: 29: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3725concat 2023-04-22 21:14:23.948 : INFO: instruction count: 29: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3726concat 2023-04-22 21:14:23.948 : INFO: instruction count: 68: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3731ENCODE_SNDArrayPointer_TO_r_ndarray_of_o_float64 2023-04-22 21:14:23.948 : INFO: instruction count: 4: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3732ENCODE_SFloat64$_TO_o_float64 2023-04-22 21:14:23.948 : INFO: instruction count: 25: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3736ENCODE_SBaseStructPointer_TO_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND 2023-04-22 21:14:23.948 : INFO: instruction count: 35: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3737ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:23.948 : INFO: instruction count: 25: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3738ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32ANDr_binaryEND 2023-04-22 21:14:23.948 : INFO: instruction count: 4: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3739ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:23.948 : INFO: instruction count: 16: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3740ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:23.949 : INFO: instruction count: 9: __C3541collect_distributed_array_matrix_block_matrix_writer.setPartitionIndex 2023-04-22 21:14:23.949 : INFO: instruction count: 4: __C3541collect_distributed_array_matrix_block_matrix_writer.addPartitionRegion 2023-04-22 21:14:23.949 : INFO: instruction count: 4: __C3541collect_distributed_array_matrix_block_matrix_writer.setPool 2023-04-22 21:14:23.949 : INFO: instruction count: 3: __C3541collect_distributed_array_matrix_block_matrix_writer.addHailClassLoader 2023-04-22 21:14:23.949 : INFO: instruction count: 3: __C3541collect_distributed_array_matrix_block_matrix_writer.addFS 2023-04-22 21:14:23.949 : INFO: instruction count: 4: __C3541collect_distributed_array_matrix_block_matrix_writer.addTaskContext 2023-04-22 21:14:23.949 : INFO: instruction count: 3: __C3541collect_distributed_array_matrix_block_matrix_writer.setObjects 2023-04-22 21:14:23.949 : INFO: instruction count: 67: __C3541collect_distributed_array_matrix_block_matrix_writer.addAndDecodeLiterals 2023-04-22 21:14:23.949 : INFO: instruction count: 45: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3744DECODE_r_struct_of_r_binaryANDr_struct_of_o_int32ENDANDr_binaryEND_TO_SBaseStructPointer 2023-04-22 21:14:23.949 : INFO: instruction count: 48: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3745INPLACE_DECODE_r_struct_of_o_int32END_TO_r_struct_of_o_int32END 2023-04-22 21:14:23.949 : INFO: instruction count: 10: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3746INPLACE_DECODE_o_int32_TO_o_int32 2023-04-22 21:14:23.949 : INFO: instruction count: 384: __C3541collect_distributed_array_matrix_block_matrix_writer.__m3559split_ToArray_region20_22 2023-04-22 21:14:23.949 : INFO: instruction count: 216: 
__C3541collect_distributed_array_matrix_block_matrix_writer.__m3562split_ToArray_region7_14 2023-04-22 21:14:23.950 : INFO: instruction count: 3: __C3747__m3559split_ToArraySpills. 2023-04-22 21:14:23.950 : INFO: instruction count: 3: __C3761__m3562split_ToArraySpills. 2023-04-22 21:14:23.950 : INFO: instruction count: 3: __C3656staticWrapperClass_1. 2023-04-22 21:14:23.987 : INFO: encoder cache hit 2023-04-22 21:14:23.988 MemoryStore: INFO: Block broadcast_192 stored as values in memory (estimated size 1584.0 B, free 25.1 GiB) 2023-04-22 21:14:23.989 MemoryStore: INFO: Block broadcast_192_piece0 stored as bytes in memory (estimated size 719.0 B, free 25.1 GiB) 2023-04-22 21:14:23.992 BlockManagerInfo: INFO: Added broadcast_192_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 719.0 B, free: 25.3 GiB) 2023-04-22 21:14:24.001 SparkContext: INFO: Created broadcast 192 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:24.001 : INFO: instruction count: 3: __C3454HailClassLoaderContainer. 2023-04-22 21:14:24.001 : INFO: instruction count: 3: __C3454HailClassLoaderContainer. 2023-04-22 21:14:24.001 : INFO: instruction count: 3: __C3456FSContainer. 2023-04-22 21:14:24.001 : INFO: instruction count: 3: __C3456FSContainer. 2023-04-22 21:14:24.010 BlockManagerInfo: INFO: Removed broadcast_170_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.027 : INFO: instruction count: 3: __C3458Compiled. 2023-04-22 21:14:24.027 : INFO: instruction count: 107: __C3458Compiled.apply 2023-04-22 21:14:24.028 : INFO: instruction count: 444: __C3458Compiled.__m3460split_ToArray 2023-04-22 21:14:24.028 : INFO: instruction count: 424: __C3458Compiled.__m3462split_ToArray 2023-04-22 21:14:24.028 : INFO: instruction count: 177: __C3458Compiled.__m3499split_ToArray 2023-04-22 21:14:24.028 : INFO: instruction count: 35: __C3458Compiled.__m3507arrayref_bounds_check 2023-04-22 21:14:24.028 : INFO: instruction count: 11: __C3458Compiled.__m3528ord_equiv 2023-04-22 21:14:24.028 : INFO: instruction count: 14: __C3458Compiled.__m3529ord_equivNonnull 2023-04-22 21:14:24.028 : INFO: instruction count: 4: __C3458Compiled.setBackend 2023-04-22 21:14:24.028 : INFO: instruction count: 9: __C3458Compiled.__m3767ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:24.029 : INFO: instruction count: 57: __C3458Compiled.__m3768ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDENDANDr_int32ANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 51: __C3458Compiled.__m3769ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32ENDEND 2023-04-22 21:14:24.029 : INFO: instruction count: 41: 
__C3458Compiled.__m3770ENCODE_SIntervalPointer_TO_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND 2023-04-22 21:14:24.029 : INFO: instruction count: 21: __C3458Compiled.__m3771ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 13: __C3458Compiled.__m3772ENCODE_SBaseStructPointer_TO_r_struct_of_r_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 4: __C3458Compiled.__m3773ENCODE_SInt32$_TO_r_int32 2023-04-22 21:14:24.029 : INFO: instruction count: 4: __C3458Compiled.__m3774ENCODE_SBoolean$_TO_r_bool 2023-04-22 21:14:24.029 : INFO: instruction count: 35: __C3458Compiled.__m3775ENCODE_SIndexablePointer_TO_r_array_of_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 33: __C3458Compiled.__m3776ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_int32ANDr_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 49: __C3458Compiled.__m3777ENCODE_SBaseStructPointer_TO_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:24.029 : INFO: instruction count: 16: __C3458Compiled.__m3778ENCODE_SStringPointer_TO_r_binary 2023-04-22 21:14:24.029 : INFO: instruction count: 9: __C3458Compiled.__m3779ENCODE_SBaseStructPointer_TO_r_struct_of_r_struct_of_ENDEND 2023-04-22 21:14:24.029 : INFO: instruction count: 1: __C3458Compiled.__m3780ENCODE_SBaseStructPointer_TO_r_struct_of_END 2023-04-22 21:14:24.029 : INFO: instruction count: 27: __C3458Compiled.__m3783DECODE_r_struct_of_r_array_of_r_struct_of_r_int32ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:14:24.029 : INFO: instruction count: 58: __C3458Compiled.__m3784INPLACE_DECODE_r_array_of_r_struct_of_r_int32ANDr_binaryEND_TO_r_array_of_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:24.029 : INFO: instruction count: 26: __C3458Compiled.__m3785INPLACE_DECODE_r_struct_of_r_int32ANDr_binaryEND_TO_r_tuple_of_r_int32ANDr_stringEND 2023-04-22 21:14:24.029 : INFO: instruction count: 10: __C3458Compiled.__m3786INPLACE_DECODE_r_int32_TO_r_int32 2023-04-22 21:14:24.029 : INFO: instruction count: 31: __C3458Compiled.__m3787INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:14:24.029 : INFO: instruction count: 12: __C3458Compiled.__m3806setup_jab 2023-04-22 21:14:24.029 : INFO: instruction count: 47: __C3458Compiled.__m3809dependent_sorting_func 2023-04-22 21:14:24.029 : INFO: instruction count: 11: __C3458Compiled.__m3812ord_lt 2023-04-22 21:14:24.029 : INFO: instruction count: 14: __C3458Compiled.__m3813ord_ltNonnull 2023-04-22 21:14:24.030 : INFO: instruction count: 202: __C3458Compiled.__m3814arraySorter_outer 2023-04-22 21:14:24.041 : INFO: instruction count: 66: __C3458Compiled.__m3815arraySorter_merge 2023-04-22 21:14:24.041 : INFO: instruction count: 36: __C3458Compiled.__m3816arraySorter_splitMerge 2023-04-22 21:14:24.041 : INFO: instruction count: 9: __C3458Compiled.setPartitionIndex 2023-04-22 21:14:24.041 : INFO: instruction count: 4: __C3458Compiled.addPartitionRegion 2023-04-22 21:14:24.041 : INFO: instruction count: 4: __C3458Compiled.setPool 2023-04-22 21:14:24.041 : INFO: instruction count: 3: __C3458Compiled.addHailClassLoader 2023-04-22 21:14:24.041 : INFO: instruction count: 3: __C3458Compiled.addFS 2023-04-22 21:14:24.041 : INFO: instruction count: 4: __C3458Compiled.addTaskContext 2023-04-22 21:14:24.041 : INFO: 
instruction count: 3: __C3458Compiled.setObjects 2023-04-22 21:14:24.041 : INFO: instruction count: 133: __C3458Compiled.addAndDecodeLiterals 2023-04-22 21:14:24.041 : INFO: instruction count: 63: __C3458Compiled.__m3831DECODE_r_struct_of_r_struct_of_ENDANDr_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32ENDANDr_array_of_r_int64ANDr_array_of_r_int64ANDr_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32ENDEND_TO_SBaseStructPointer 2023-04-22 21:14:24.041 : INFO: instruction count: 8: __C3458Compiled.__m3832INPLACE_DECODE_r_struct_of_END_TO_r_struct_of_END 2023-04-22 21:14:24.041 : INFO: instruction count: 58: __C3458Compiled.__m3833INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_array_of_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:24.041 : INFO: instruction count: 44: __C3458Compiled.__m3834INPLACE_DECODE_r_struct_of_r_binaryANDr_int32ANDr_int32ANDr_int32END_TO_r_struct_of_r_stringANDr_int32ANDr_int32ANDr_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 58: __C3458Compiled.__m3835INPLACE_DECODE_r_array_of_r_int64_TO_r_array_of_r_int64 2023-04-22 21:14:24.042 : INFO: instruction count: 10: __C3458Compiled.__m3836INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:14:24.042 : INFO: instruction count: 58: __C3458Compiled.__m3837INPLACE_DECODE_r_array_of_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_array_of_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 26: __C3458Compiled.__m3838INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolENDANDr_array_of_r_int32END_TO_r_struct_of_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_array_of_r_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 44: __C3458Compiled.__m3839INPLACE_DECODE_r_struct_of_r_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_struct_of_r_struct_of_r_int32ENDANDr_int32ENDANDr_boolANDr_boolEND_TO_r_interval_of_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 26: __C3458Compiled.__m3840INPLACE_DECODE_r_struct_of_r_struct_of_r_int32ENDANDr_int32END_TO_r_tuple_of_r_struct_of_r_int32ENDANDr_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 17: __C3458Compiled.__m3841INPLACE_DECODE_r_struct_of_r_int32END_TO_r_struct_of_r_int32END 2023-04-22 21:14:24.042 : INFO: instruction count: 10: __C3458Compiled.__m3842INPLACE_DECODE_r_bool_TO_r_bool 2023-04-22 21:14:24.042 : INFO: instruction count: 58: __C3458Compiled.__m3843INPLACE_DECODE_r_array_of_r_int32_TO_r_array_of_r_int32 2023-04-22 21:14:24.042 : INFO: instruction count: 110: __C3458Compiled.__m3460split_ToArray_region22_24 2023-04-22 21:14:24.043 : INFO: instruction count: 586: __C3458Compiled.__m3460split_ToArray_region16_69 2023-04-22 21:14:24.043 : INFO: instruction count: 3: __C3844__m3460split_ToArraySpills. 2023-04-22 21:14:24.043 : INFO: instruction count: 3: __C3781staticWrapperClass_1. 
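
The "instruction count" lines above are Hail's compiler reporting the JVM bytecode size of each method it generates while lowering this query: __C3541collect_distributed_array_matrix_block_matrix_writer is the per-task body of a collect_distributed_array staged for a block-matrix write (including the float64-ndarray encoder __m3731), and __C3458Compiled is the driver-side wrapper that decodes literals, sorts, and assembles the result. The log does not show the Python pipeline that produced this plan; the following is only a minimal sketch, assuming the job was writing a MatrixTable entry field out as a BlockMatrix, with the paths and the entry expression purely hypothetical:

    import hail as hl

    # Assumption: local driver matching the spark.master=local[*] /
    # spark.driver.memory settings recorded at the top of this log.
    hl.init(master='local[*]', log='hail.log')

    # Hypothetical input; the real dataset is not named anywhere in the log.
    mt = hl.read_matrix_table('data/example.mt')

    # Writing an entry expression out as a BlockMatrix is the kind of operation
    # that would produce a collect_distributed_array named
    # matrix_block_matrix_writer like the one executed below: each Spark task
    # encodes one block of float64 values.
    hl.linalg.BlockMatrix.write_from_entry_expr(
        hl.float64(hl.or_else(mt.GT.n_alt_alleles(), 0)),  # hypothetical entry field
        'output/example.bm',
        overwrite=True,
    )
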
2023-04-22 21:14:24.091 BlockManagerInfo: INFO: Removed broadcast_172_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 32.4 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.152 BlockManagerInfo: INFO: Removed broadcast_152_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 152.0 B, free: 25.3 GiB) 2023-04-22 21:14:24.193 : INFO: executing D-Array [matrix_block_matrix_writer] with 56 tasks 2023-04-22 21:14:24.193 MemoryStore: INFO: Block broadcast_193 stored as values in memory (estimated size 64.0 B, free 25.1 GiB) 2023-04-22 21:14:24.194 MemoryStore: INFO: Block broadcast_193_piece0 stored as bytes in memory (estimated size 49.0 B, free 25.1 GiB) 2023-04-22 21:14:24.195 BlockManagerInfo: INFO: Added broadcast_193_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 49.0 B, free: 25.3 GiB) 2023-04-22 21:14:24.196 SparkContext: INFO: Created broadcast 193 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:24.197 MemoryStore: INFO: Block broadcast_194 stored as values in memory (estimated size 429.5 KiB, free 25.1 GiB) 2023-04-22 21:14:24.209 MemoryStore: INFO: Block broadcast_194_piece0 stored as bytes in memory (estimated size 32.4 KiB, free 25.1 GiB) 2023-04-22 21:14:24.209 BlockManagerInfo: INFO: Added broadcast_194_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 32.4 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.210 SparkContext: INFO: Created broadcast 194 from broadcast at SparkBackend.scala:354 2023-04-22 21:14:24.268 BlockManagerInfo: INFO: Removed broadcast_158_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.361 SparkContext: INFO: Starting job: collect at SparkBackend.scala:368 2023-04-22 21:14:24.361 DAGScheduler: INFO: Got job 46 (collect at SparkBackend.scala:368) with 56 output partitions 2023-04-22 21:14:24.361 DAGScheduler: INFO: Final stage: ResultStage 85 (collect at SparkBackend.scala:368) 2023-04-22 21:14:24.361 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:14:24.362 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:14:24.362 DAGScheduler: INFO: Submitting ResultStage 85 (SparkBackendComputeRDD[187] at RDD at SparkBackend.scala:784), which has no missing parents 2023-04-22 21:14:24.403 MemoryStore: INFO: Block broadcast_195 stored as values in memory (estimated size 779.9 KiB, free 25.1 GiB) 2023-04-22 21:14:24.407 BlockManagerInfo: INFO: Removed broadcast_173_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 9.5 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.413 MemoryStore: INFO: Block broadcast_195_piece0 stored as bytes in memory (estimated size 393.7 KiB, free 25.1 GiB) 2023-04-22 21:14:24.414 BlockManagerInfo: INFO: Added broadcast_195_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 393.7 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.414 SparkContext: INFO: Created broadcast 195 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:14:24.414 DAGScheduler: INFO: Submitting 56 missing tasks from ResultStage 85 (SparkBackendComputeRDD[187] at RDD at SparkBackend.scala:784) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 2023-04-22 21:14:24.414 TaskSchedulerImpl: INFO: Adding task set 85.0 with 56 tasks resource profile 0 2023-04-22 21:14:24.416 TaskSetManager: INFO: Starting task 0.0 in stage 85.0 (TID 448) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:24.416 Executor: INFO: Running task 0.0 in 
stage 85.0 (TID 448) 2023-04-22 21:14:24.437 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.454 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.480 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.526 BlockManagerInfo: INFO: Removed broadcast_185_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 32.4 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.559 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.638 BlockManagerInfo: INFO: Removed broadcast_161_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.687 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.739 BlockManagerInfo: INFO: Removed broadcast_186_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 375.3 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.792 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:24.845 BlockManagerInfo: INFO: Removed broadcast_181_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 217.0 B, free: 25.3 GiB) 2023-04-22 21:14:24.941 BlockManagerInfo: INFO: Removed broadcast_165_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 351.3 KiB, free: 25.3 GiB) 2023-04-22 21:14:24.981 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:25.057 BlockManagerInfo: INFO: Removed broadcast_155_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 2.3 MiB, free: 25.3 GiB) 2023-04-22 21:14:25.168 BlockManagerInfo: INFO: Removed broadcast_183_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 217.0 B, free: 25.3 GiB) 2023-04-22 21:14:25.260 BlockManagerInfo: INFO: Removed broadcast_162_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:14:25.264 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:25.339 BlockManagerInfo: INFO: Removed broadcast_184_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 49.0 B, free: 25.3 GiB) 2023-04-22 21:14:25.398 BlockManagerInfo: INFO: Removed broadcast_144_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 419.4 KiB, free: 25.3 GiB) 2023-04-22 21:14:25.480 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 
(TID 448) 2023-04-22 21:14:25.497 BlockManagerInfo: INFO: Removed broadcast_176_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 60.0 B, free: 25.3 GiB) 2023-04-22 21:14:25.573 BlockManagerInfo: INFO: Removed broadcast_156_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 17.5 KiB, free: 25.3 GiB) 2023-04-22 21:14:25.635 BlockManagerInfo: INFO: Removed broadcast_146_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:14:25.684 BlockManagerInfo: INFO: Removed broadcast_168_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:14:25.786 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:26.322 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:27.351 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:28.030 : INFO: TaskReport: stage=85, partition=0, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=12288, cache hits=8191 2023-04-22 21:14:28.030 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 0.0 in stage 85.0 (TID 448) 2023-04-22 21:14:28.037 Executor: INFO: Finished task 0.0 in stage 85.0 (TID 448). 
960 bytes result sent to driver 2023-04-22 21:14:28.049 TaskSetManager: INFO: Starting task 1.0 in stage 85.0 (TID 449) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:28.050 TaskSetManager: INFO: Finished task 0.0 in stage 85.0 (TID 448) in 3634 ms on uger-c010.broadinstitute.org (executor driver) (1/56) 2023-04-22 21:14:28.064 Executor: INFO: Running task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:28.097 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:28.097 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:28.163 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:28.748 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:29.836 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:30.091 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:30.166 : INFO: TaskReport: stage=85, partition=1, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=4099, cache hits=4096 2023-04-22 21:14:30.166 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 85.0 (TID 449) 2023-04-22 21:14:30.166 Executor: INFO: Finished task 1.0 in stage 85.0 (TID 449). 
961 bytes result sent to driver 2023-04-22 21:14:30.167 TaskSetManager: INFO: Starting task 2.0 in stage 85.0 (TID 450) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:30.167 TaskSetManager: INFO: Finished task 1.0 in stage 85.0 (TID 449) in 2118 ms on uger-c010.broadinstitute.org (executor driver) (2/56) 2023-04-22 21:14:30.168 Executor: INFO: Running task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:30.190 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:30.190 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:31.967 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:31.971 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:31.979 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:31.994 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:32.025 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:32.089 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:32.215 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:32.485 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:32.994 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:34.000 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:34.607 : INFO: TaskReport: stage=85, partition=2, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=16384, cache hits=12287 2023-04-22 21:14:34.607 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 85.0 (TID 450) 2023-04-22 21:14:34.610 Executor: 
INFO: Finished task 2.0 in stage 85.0 (TID 450). 960 bytes result sent to driver 2023-04-22 21:14:34.634 TaskSetManager: INFO: Starting task 3.0 in stage 85.0 (TID 451) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:34.635 TaskSetManager: INFO: Finished task 2.0 in stage 85.0 (TID 450) in 4467 ms on uger-c010.broadinstitute.org (executor driver) (3/56) 2023-04-22 21:14:34.635 Executor: INFO: Running task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:34.676 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:34.676 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:36.504 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:36.987 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:37.952 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:38.198 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:38.228 : INFO: TaskReport: stage=85, partition=3, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=8195, cache hits=8192 2023-04-22 21:14:38.228 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 85.0 (TID 451) 2023-04-22 21:14:38.228 Executor: INFO: Finished task 3.0 in stage 85.0 (TID 451). 
961 bytes result sent to driver 2023-04-22 21:14:38.229 TaskSetManager: INFO: Starting task 4.0 in stage 85.0 (TID 452) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:38.229 TaskSetManager: INFO: Finished task 3.0 in stage 85.0 (TID 451) in 3595 ms on uger-c010.broadinstitute.org (executor driver) (4/56) 2023-04-22 21:14:38.229 Executor: INFO: Running task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:38.251 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:38.251 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:41.748 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:41.752 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:41.798 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:41.826 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:41.869 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:42.005 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:42.212 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:42.462 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:43.197 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:44.168 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:44.722 : INFO: TaskReport: stage=85, partition=4, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=20480, cache hits=16383 2023-04-22 21:14:44.722 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 85.0 (TID 452) 2023-04-22 21:14:44.725 Executor: 
INFO: Finished task 4.0 in stage 85.0 (TID 452). 960 bytes result sent to driver 2023-04-22 21:14:44.737 TaskSetManager: INFO: Starting task 5.0 in stage 85.0 (TID 453) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:14:44.738 TaskSetManager: INFO: Finished task 4.0 in stage 85.0 (TID 452) in 6510 ms on uger-c010.broadinstitute.org (executor driver) (5/56) 2023-04-22 21:14:44.738 Executor: INFO: Running task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:44.760 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:44.763 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:48.335 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:48.818 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:49.783 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:50.028 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:50.058 : INFO: TaskReport: stage=85, partition=5, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=12291, cache hits=12288 2023-04-22 21:14:50.058 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 5.0 in stage 85.0 (TID 453) 2023-04-22 21:14:50.059 Executor: INFO: Finished task 5.0 in stage 85.0 (TID 453). 
961 bytes result sent to driver 2023-04-22 21:14:50.061 TaskSetManager: INFO: Starting task 6.0 in stage 85.0 (TID 454) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:14:50.061 TaskSetManager: INFO: Finished task 5.0 in stage 85.0 (TID 453) in 5324 ms on uger-c010.broadinstitute.org (executor driver) (6/56) 2023-04-22 21:14:50.061 Executor: INFO: Running task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:50.083 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:50.083 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.340 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.344 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.352 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.367 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.397 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.457 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.576 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:55.815 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:56.304 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:57.268 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:57.797 : INFO: TaskReport: stage=85, partition=6, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=24576, cache hits=20479 2023-04-22 21:14:57.797 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 6.0 in stage 85.0 (TID 454) 2023-04-22 21:14:57.805 Executor: 
INFO: Finished task 6.0 in stage 85.0 (TID 454). 960 bytes result sent to driver 2023-04-22 21:14:57.805 TaskSetManager: INFO: Starting task 7.0 in stage 85.0 (TID 455) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:14:57.805 TaskSetManager: INFO: Finished task 6.0 in stage 85.0 (TID 454) in 7744 ms on uger-c010.broadinstitute.org (executor driver) (7/56) 2023-04-22 21:14:57.806 Executor: INFO: Running task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:14:57.828 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:14:57.828 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:03.217 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:03.699 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:04.676 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:04.922 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:04.945 : INFO: TaskReport: stage=85, partition=7, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=16387, cache hits=16384 2023-04-22 21:15:04.945 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 7.0 in stage 85.0 (TID 455) 2023-04-22 21:15:04.945 Executor: INFO: Finished task 7.0 in stage 85.0 (TID 455). 
961 bytes result sent to driver 2023-04-22 21:15:04.947 TaskSetManager: INFO: Starting task 8.0 in stage 85.0 (TID 456) (uger-c010.broadinstitute.org, executor driver, partition 8, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:04.947 TaskSetManager: INFO: Finished task 7.0 in stage 85.0 (TID 455) in 7142 ms on uger-c010.broadinstitute.org (executor driver) (8/56) 2023-04-22 21:15:04.949 Executor: INFO: Running task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:04.970 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:04.971 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:05.869 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:05.878 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:05.885 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:05.900 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:05.930 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:06.000 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:06.119 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:06.359 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:06.838 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:07.818 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:08.359 : INFO: TaskReport: stage=85, partition=8, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=14348, cache hits=10251 2023-04-22 21:15:08.359 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 8.0 in stage 85.0 (TID 456) 2023-04-22 21:15:08.367 Executor: 
INFO: Finished task 8.0 in stage 85.0 (TID 456). 960 bytes result sent to driver 2023-04-22 21:15:08.367 TaskSetManager: INFO: Starting task 9.0 in stage 85.0 (TID 457) (uger-c010.broadinstitute.org, executor driver, partition 9, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:08.368 TaskSetManager: INFO: Finished task 8.0 in stage 85.0 (TID 456) in 3421 ms on uger-c010.broadinstitute.org (executor driver) (9/56) 2023-04-22 21:15:08.368 Executor: INFO: Running task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:08.390 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:08.390 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:09.341 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:09.823 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:10.786 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:11.031 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:11.056 : INFO: TaskReport: stage=85, partition=9, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=6159, cache hits=6156 2023-04-22 21:15:11.056 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 9.0 in stage 85.0 (TID 457) 2023-04-22 21:15:11.057 Executor: INFO: Finished task 9.0 in stage 85.0 (TID 457). 
961 bytes result sent to driver 2023-04-22 21:15:11.057 TaskSetManager: INFO: Starting task 10.0 in stage 85.0 (TID 458) (uger-c010.broadinstitute.org, executor driver, partition 10, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:11.057 TaskSetManager: INFO: Finished task 9.0 in stage 85.0 (TID 457) in 2690 ms on uger-c010.broadinstitute.org (executor driver) (10/56) 2023-04-22 21:15:11.058 Executor: INFO: Running task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:11.079 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:11.080 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.725 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.729 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.736 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.751 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.781 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.841 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:13.960 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:14.203 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:14.680 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:15.658 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 21:15:16.211 : INFO: TaskReport: stage=85, partition=10, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=18444, cache hits=14347 2023-04-22 21:15:16.211 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 10.0 in stage 85.0 (TID 458) 2023-04-22 
21:15:16.221 Executor: INFO: Finished task 10.0 in stage 85.0 (TID 458). 960 bytes result sent to driver 2023-04-22 21:15:16.221 TaskSetManager: INFO: Starting task 11.0 in stage 85.0 (TID 459) (uger-c010.broadinstitute.org, executor driver, partition 11, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:16.221 TaskSetManager: INFO: Finished task 10.0 in stage 85.0 (TID 458) in 5164 ms on uger-c010.broadinstitute.org (executor driver) (11/56) 2023-04-22 21:15:16.224 Executor: INFO: Running task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:16.246 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:16.246 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:19.153 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:19.636 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:20.599 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:20.844 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:20.875 : INFO: TaskReport: stage=85, partition=11, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=10255, cache hits=10252 2023-04-22 21:15:20.875 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 11.0 in stage 85.0 (TID 459) 2023-04-22 21:15:20.876 Executor: INFO: Finished task 11.0 in stage 85.0 (TID 459). 
961 bytes result sent to driver 2023-04-22 21:15:20.876 TaskSetManager: INFO: Starting task 12.0 in stage 85.0 (TID 460) (uger-c010.broadinstitute.org, executor driver, partition 12, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:15:20.876 TaskSetManager: INFO: Finished task 11.0 in stage 85.0 (TID 459) in 4655 ms on uger-c010.broadinstitute.org (executor driver) (12/56) 2023-04-22 21:15:20.893 Executor: INFO: Running task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:20.929 : INFO: RegionPool: initialized for thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:20.929 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.694 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.698 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.705 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.720 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.750 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.810 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:25.930 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:26.168 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:26.646 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:27.627 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 21:15:28.193 : INFO: TaskReport: stage=85, partition=12, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=22540, cache hits=18443 2023-04-22 21:15:28.193 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 359: Executor task launch worker for task 12.0 in stage 85.0 (TID 460) 2023-04-22 
21:15:28.201 Executor: INFO: Finished task 12.0 in stage 85.0 (TID 460). 960 bytes result sent to driver 2023-04-22 21:15:28.201 TaskSetManager: INFO: Starting task 13.0 in stage 85.0 (TID 461) (uger-c010.broadinstitute.org, executor driver, partition 13, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:15:28.202 TaskSetManager: INFO: Finished task 12.0 in stage 85.0 (TID 460) in 7325 ms on uger-c010.broadinstitute.org (executor driver) (13/56) 2023-04-22 21:15:28.202 Executor: INFO: Running task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:28.224 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:28.225 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:32.664 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:33.147 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:34.113 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:34.359 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:34.381 : INFO: TaskReport: stage=85, partition=13, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=14351, cache hits=14348 2023-04-22 21:15:34.381 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 85.0 (TID 461) 2023-04-22 21:15:34.382 Executor: INFO: Finished task 13.0 in stage 85.0 (TID 461). 
961 bytes result sent to driver 2023-04-22 21:15:34.382 TaskSetManager: INFO: Starting task 14.0 in stage 85.0 (TID 462) (uger-c010.broadinstitute.org, executor driver, partition 14, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:34.382 TaskSetManager: INFO: Finished task 13.0 in stage 85.0 (TID 461) in 6181 ms on uger-c010.broadinstitute.org (executor driver) (14/56) 2023-04-22 21:15:34.383 Executor: INFO: Running task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.405 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.405 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.418 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.422 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.429 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.445 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.475 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.535 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.656 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:34.897 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:35.374 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:36.333 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 21:15:36.879 : INFO: TaskReport: stage=85, partition=14, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=12312, cache hits=8215 2023-04-22 21:15:36.879 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 85.0 (TID 462) 2023-04-22 
21:15:36.882 Executor: INFO: Finished task 14.0 in stage 85.0 (TID 462). 960 bytes result sent to driver 2023-04-22 21:15:36.882 TaskSetManager: INFO: Starting task 15.0 in stage 85.0 (TID 463) (uger-c010.broadinstitute.org, executor driver, partition 15, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:36.883 TaskSetManager: INFO: Finished task 14.0 in stage 85.0 (TID 462) in 2501 ms on uger-c010.broadinstitute.org (executor driver) (15/56) 2023-04-22 21:15:36.883 Executor: INFO: Running task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:36.905 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:36.906 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:36.978 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:37.460 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:38.425 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:38.670 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:38.696 : INFO: TaskReport: stage=85, partition=15, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=4123, cache hits=4120 2023-04-22 21:15:38.696 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 85.0 (TID 463) 2023-04-22 21:15:38.697 Executor: INFO: Finished task 15.0 in stage 85.0 (TID 463). 
961 bytes result sent to driver 2023-04-22 21:15:38.697 TaskSetManager: INFO: Starting task 16.0 in stage 85.0 (TID 464) (uger-c010.broadinstitute.org, executor driver, partition 16, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:38.697 TaskSetManager: INFO: Finished task 15.0 in stage 85.0 (TID 463) in 1815 ms on uger-c010.broadinstitute.org (executor driver) (16/56) 2023-04-22 21:15:38.698 Executor: INFO: Running task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:38.722 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:38.722 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.495 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.499 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.507 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.522 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.552 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.612 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.732 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:40.973 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:41.454 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:42.416 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 21:15:42.959 : INFO: TaskReport: stage=85, partition=16, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=16408, cache hits=12311 2023-04-22 21:15:42.959 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 85.0 (TID 464) 2023-04-22 
21:15:42.962 Executor: INFO: Finished task 16.0 in stage 85.0 (TID 464). 960 bytes result sent to driver 2023-04-22 21:15:42.963 TaskSetManager: INFO: Starting task 17.0 in stage 85.0 (TID 465) (uger-c010.broadinstitute.org, executor driver, partition 17, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:42.963 TaskSetManager: INFO: Finished task 16.0 in stage 85.0 (TID 464) in 4266 ms on uger-c010.broadinstitute.org (executor driver) (17/56) 2023-04-22 21:15:42.964 Executor: INFO: Running task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:42.993 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:42.993 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:44.826 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:45.309 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:46.274 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:46.519 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:46.541 : INFO: TaskReport: stage=85, partition=17, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=8219, cache hits=8216 2023-04-22 21:15:46.542 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 85.0 (TID 465) 2023-04-22 21:15:46.542 Executor: INFO: Finished task 17.0 in stage 85.0 (TID 465). 
961 bytes result sent to driver 2023-04-22 21:15:46.542 TaskSetManager: INFO: Starting task 18.0 in stage 85.0 (TID 466) (uger-c010.broadinstitute.org, executor driver, partition 18, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:46.542 TaskSetManager: INFO: Finished task 17.0 in stage 85.0 (TID 465) in 3580 ms on uger-c010.broadinstitute.org (executor driver) (18/56) 2023-04-22 21:15:46.543 Executor: INFO: Running task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:46.565 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:46.565 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.074 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.078 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.085 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.100 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.130 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.190 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.309 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:50.548 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:51.027 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:51.982 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 21:15:52.536 : INFO: TaskReport: stage=85, partition=18, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=20504, cache hits=16407 2023-04-22 21:15:52.536 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 85.0 (TID 466) 2023-04-22 
21:15:52.539 Executor: INFO: Finished task 18.0 in stage 85.0 (TID 466). 960 bytes result sent to driver 2023-04-22 21:15:52.539 TaskSetManager: INFO: Starting task 19.0 in stage 85.0 (TID 467) (uger-c010.broadinstitute.org, executor driver, partition 19, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:15:52.550 TaskSetManager: INFO: Finished task 18.0 in stage 85.0 (TID 466) in 6008 ms on uger-c010.broadinstitute.org (executor driver) (19/56) 2023-04-22 21:15:52.554 Executor: INFO: Running task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:52.576 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:52.576 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:56.142 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:56.624 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:57.600 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:57.845 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:57.868 : INFO: TaskReport: stage=85, partition=19, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=12315, cache hits=12312 2023-04-22 21:15:57.868 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 85.0 (TID 467) 2023-04-22 21:15:57.869 Executor: INFO: Finished task 19.0 in stage 85.0 (TID 467). 
961 bytes result sent to driver 2023-04-22 21:15:57.869 TaskSetManager: INFO: Starting task 20.0 in stage 85.0 (TID 468) (uger-c010.broadinstitute.org, executor driver, partition 20, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:15:57.869 TaskSetManager: INFO: Finished task 19.0 in stage 85.0 (TID 467) in 5330 ms on uger-c010.broadinstitute.org (executor driver) (20/56) 2023-04-22 21:15:57.870 Executor: INFO: Running task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:15:57.891 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:15:57.891 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.150 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.154 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.161 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.176 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.206 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.266 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.385 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:03.702 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:04.183 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:05.139 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 21:16:05.679 : INFO: TaskReport: stage=85, partition=20, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=24600, cache hits=20503 2023-04-22 21:16:05.679 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 85.0 (TID 468) 2023-04-22 
21:16:05.682 Executor: INFO: Finished task 20.0 in stage 85.0 (TID 468). 961 bytes result sent to driver 2023-04-22 21:16:05.682 TaskSetManager: INFO: Starting task 21.0 in stage 85.0 (TID 469) (uger-c010.broadinstitute.org, executor driver, partition 21, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:16:05.683 TaskSetManager: INFO: Finished task 20.0 in stage 85.0 (TID 468) in 7814 ms on uger-c010.broadinstitute.org (executor driver) (21/56) 2023-04-22 21:16:05.683 Executor: INFO: Running task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:05.709 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:05.709 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:11.017 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:11.499 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:12.465 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:12.710 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:12.733 : INFO: TaskReport: stage=85, partition=21, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=16411, cache hits=16408 2023-04-22 21:16:12.733 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 85.0 (TID 469) 2023-04-22 21:16:12.733 Executor: INFO: Finished task 21.0 in stage 85.0 (TID 469). 
961 bytes result sent to driver 2023-04-22 21:16:12.734 TaskSetManager: INFO: Starting task 22.0 in stage 85.0 (TID 470) (uger-c010.broadinstitute.org, executor driver, partition 22, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:12.744 TaskSetManager: INFO: Finished task 21.0 in stage 85.0 (TID 469) in 7062 ms on uger-c010.broadinstitute.org (executor driver) (22/56) 2023-04-22 21:16:12.744 Executor: INFO: Running task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:12.779 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:12.780 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.671 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.675 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.683 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.698 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.727 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.789 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:13.908 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:14.149 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:14.627 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:15.584 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 21:16:16.116 : INFO: TaskReport: stage=85, partition=22, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=14372, cache hits=10275 2023-04-22 21:16:16.116 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 85.0 (TID 470) 2023-04-22 
21:16:16.119 Executor: INFO: Finished task 22.0 in stage 85.0 (TID 470). 961 bytes result sent to driver 2023-04-22 21:16:16.121 TaskSetManager: INFO: Starting task 23.0 in stage 85.0 (TID 471) (uger-c010.broadinstitute.org, executor driver, partition 23, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:16.121 TaskSetManager: INFO: Finished task 22.0 in stage 85.0 (TID 470) in 3388 ms on uger-c010.broadinstitute.org (executor driver) (23/56) 2023-04-22 21:16:16.122 Executor: INFO: Running task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:16.144 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:16.144 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:17.095 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:17.578 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:17.990 BlockManagerInfo: INFO: Removed broadcast_188_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 719.0 B, free: 25.3 GiB) 2023-04-22 21:16:18.041 BlockManagerInfo: INFO: Removed broadcast_187_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 155.0 B, free: 25.3 GiB) 2023-04-22 21:16:18.611 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:18.857 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:18.887 : INFO: TaskReport: stage=85, partition=23, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=6183, cache hits=6180 2023-04-22 21:16:18.887 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 85.0 (TID 471) 2023-04-22 21:16:18.888 Executor: INFO: Finished task 23.0 in stage 85.0 (TID 471). 
1004 bytes result sent to driver 2023-04-22 21:16:18.889 TaskSetManager: INFO: Starting task 24.0 in stage 85.0 (TID 472) (uger-c010.broadinstitute.org, executor driver, partition 24, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:18.892 TaskSetManager: INFO: Finished task 23.0 in stage 85.0 (TID 471) in 2771 ms on uger-c010.broadinstitute.org (executor driver) (24/56) 2023-04-22 21:16:18.902 Executor: INFO: Running task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:18.927 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:18.928 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.567 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.571 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.579 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.594 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.624 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.683 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:21.803 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:22.041 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:22.521 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:23.480 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 21:16:24.030 : INFO: TaskReport: stage=85, partition=24, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=18468, cache hits=14371 2023-04-22 21:16:24.030 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 85.0 (TID 472) 2023-04-22 
21:16:24.033 Executor: INFO: Finished task 24.0 in stage 85.0 (TID 472). 961 bytes result sent to driver 2023-04-22 21:16:24.033 TaskSetManager: INFO: Starting task 25.0 in stage 85.0 (TID 473) (uger-c010.broadinstitute.org, executor driver, partition 25, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:24.034 TaskSetManager: INFO: Finished task 24.0 in stage 85.0 (TID 472) in 5145 ms on uger-c010.broadinstitute.org (executor driver) (25/56) 2023-04-22 21:16:24.034 Executor: INFO: Running task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:24.056 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:24.057 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:26.770 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:27.252 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:28.216 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:28.461 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:28.484 : INFO: TaskReport: stage=85, partition=25, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=10279, cache hits=10276 2023-04-22 21:16:28.484 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 85.0 (TID 473) 2023-04-22 21:16:28.484 Executor: INFO: Finished task 25.0 in stage 85.0 (TID 473). 
961 bytes result sent to driver 2023-04-22 21:16:28.488 TaskSetManager: INFO: Starting task 26.0 in stage 85.0 (TID 474) (uger-c010.broadinstitute.org, executor driver, partition 26, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:16:28.488 TaskSetManager: INFO: Finished task 25.0 in stage 85.0 (TID 473) in 4455 ms on uger-c010.broadinstitute.org (executor driver) (26/56) 2023-04-22 21:16:28.488 Executor: INFO: Running task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:28.510 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:28.510 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:32.896 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:32.899 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:32.907 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:32.922 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:32.952 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:33.011 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:33.131 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:33.369 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:33.847 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:34.811 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 21:16:35.311 : INFO: TaskReport: stage=85, partition=26, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=22564, cache hits=18467 2023-04-22 21:16:35.311 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 85.0 (TID 474) 2023-04-22 
21:16:35.314 Executor: INFO: Finished task 26.0 in stage 85.0 (TID 474). 961 bytes result sent to driver 2023-04-22 21:16:35.314 TaskSetManager: INFO: Starting task 27.0 in stage 85.0 (TID 475) (uger-c010.broadinstitute.org, executor driver, partition 27, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:16:35.314 TaskSetManager: INFO: Finished task 26.0 in stage 85.0 (TID 474) in 6826 ms on uger-c010.broadinstitute.org (executor driver) (27/56) 2023-04-22 21:16:35.315 Executor: INFO: Running task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:35.337 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:35.337 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:39.786 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:40.269 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:41.235 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:41.482 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:41.510 : INFO: TaskReport: stage=85, partition=27, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=14375, cache hits=14372 2023-04-22 21:16:41.510 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 85.0 (TID 475) 2023-04-22 21:16:41.510 Executor: INFO: Finished task 27.0 in stage 85.0 (TID 475). 
961 bytes result sent to driver 2023-04-22 21:16:41.511 TaskSetManager: INFO: Starting task 28.0 in stage 85.0 (TID 476) (uger-c010.broadinstitute.org, executor driver, partition 28, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:41.511 TaskSetManager: INFO: Finished task 27.0 in stage 85.0 (TID 475) in 6197 ms on uger-c010.broadinstitute.org (executor driver) (28/56) 2023-04-22 21:16:41.512 Executor: INFO: Running task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.533 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.533 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.555 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.559 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.566 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.581 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.611 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.673 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:41.793 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:42.032 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:42.512 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:43.468 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 21:16:44.039 : INFO: TaskReport: stage=85, partition=28, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=12336, cache hits=8239 2023-04-22 21:16:44.039 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 85.0 (TID 476) 2023-04-22 
21:16:44.042 Executor: INFO: Finished task 28.0 in stage 85.0 (TID 476). 961 bytes result sent to driver 2023-04-22 21:16:44.042 TaskSetManager: INFO: Starting task 29.0 in stage 85.0 (TID 477) (uger-c010.broadinstitute.org, executor driver, partition 29, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:44.043 TaskSetManager: INFO: Finished task 28.0 in stage 85.0 (TID 476) in 2531 ms on uger-c010.broadinstitute.org (executor driver) (29/56) 2023-04-22 21:16:44.046 Executor: INFO: Running task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:44.067 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:44.067 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:44.153 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:44.641 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:45.608 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:45.853 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:45.875 : INFO: TaskReport: stage=85, partition=29, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=4147, cache hits=4144 2023-04-22 21:16:45.875 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 85.0 (TID 477) 2023-04-22 21:16:45.875 Executor: INFO: Finished task 29.0 in stage 85.0 (TID 477). 
961 bytes result sent to driver 2023-04-22 21:16:45.875 TaskSetManager: INFO: Starting task 30.0 in stage 85.0 (TID 478) (uger-c010.broadinstitute.org, executor driver, partition 30, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:45.875 TaskSetManager: INFO: Finished task 29.0 in stage 85.0 (TID 477) in 1833 ms on uger-c010.broadinstitute.org (executor driver) (30/56) 2023-04-22 21:16:45.876 Executor: INFO: Running task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:45.897 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:45.897 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.665 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.669 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.676 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.691 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.721 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.781 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:47.900 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:48.141 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:48.619 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:49.575 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 21:16:50.091 : INFO: TaskReport: stage=85, partition=30, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=16432, cache hits=12335 2023-04-22 21:16:50.092 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 85.0 (TID 478) 2023-04-22 
21:16:50.094 Executor: INFO: Finished task 30.0 in stage 85.0 (TID 478). 961 bytes result sent to driver 2023-04-22 21:16:50.095 TaskSetManager: INFO: Starting task 31.0 in stage 85.0 (TID 479) (uger-c010.broadinstitute.org, executor driver, partition 31, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:50.095 TaskSetManager: INFO: Finished task 30.0 in stage 85.0 (TID 478) in 4220 ms on uger-c010.broadinstitute.org (executor driver) (31/56) 2023-04-22 21:16:50.096 Executor: INFO: Running task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:50.117 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:50.117 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:51.948 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:52.432 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:53.398 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:53.643 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:53.665 : INFO: TaskReport: stage=85, partition=31, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=8243, cache hits=8240 2023-04-22 21:16:53.665 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 85.0 (TID 479) 2023-04-22 21:16:53.665 Executor: INFO: Finished task 31.0 in stage 85.0 (TID 479). 
961 bytes result sent to driver 2023-04-22 21:16:53.665 TaskSetManager: INFO: Starting task 32.0 in stage 85.0 (TID 480) (uger-c010.broadinstitute.org, executor driver, partition 32, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:53.666 TaskSetManager: INFO: Finished task 31.0 in stage 85.0 (TID 479) in 3571 ms on uger-c010.broadinstitute.org (executor driver) (32/56) 2023-04-22 21:16:53.667 Executor: INFO: Running task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:53.689 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:53.689 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.229 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.233 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.240 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.255 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.286 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.346 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.466 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:57.706 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:58.187 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:59.149 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 21:16:59.694 : INFO: TaskReport: stage=85, partition=32, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=20528, cache hits=16431 2023-04-22 21:16:59.694 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 85.0 (TID 480) 2023-04-22 
21:16:59.696 Executor: INFO: Finished task 32.0 in stage 85.0 (TID 480). 961 bytes result sent to driver 2023-04-22 21:16:59.697 TaskSetManager: INFO: Starting task 33.0 in stage 85.0 (TID 481) (uger-c010.broadinstitute.org, executor driver, partition 33, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:16:59.697 TaskSetManager: INFO: Finished task 32.0 in stage 85.0 (TID 480) in 6032 ms on uger-c010.broadinstitute.org (executor driver) (33/56) 2023-04-22 21:16:59.698 Executor: INFO: Running task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:16:59.724 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:16:59.724 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:03.317 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:03.800 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:04.769 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:05.013 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:05.040 : INFO: TaskReport: stage=85, partition=33, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=12339, cache hits=12336 2023-04-22 21:17:05.040 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 85.0 (TID 481) 2023-04-22 21:17:05.042 Executor: INFO: Finished task 33.0 in stage 85.0 (TID 481). 
961 bytes result sent to driver 2023-04-22 21:17:05.042 TaskSetManager: INFO: Starting task 34.0 in stage 85.0 (TID 482) (uger-c010.broadinstitute.org, executor driver, partition 34, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:17:05.053 TaskSetManager: INFO: Finished task 33.0 in stage 85.0 (TID 481) in 5356 ms on uger-c010.broadinstitute.org (executor driver) (34/56) 2023-04-22 21:17:05.053 Executor: INFO: Running task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:05.095 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:05.096 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.363 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.367 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.374 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.389 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.419 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.478 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.599 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:10.838 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:11.339 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:12.298 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 21:17:12.823 : INFO: TaskReport: stage=85, partition=34, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=24624, cache hits=20527 2023-04-22 21:17:12.823 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 85.0 (TID 482) 2023-04-22 
21:17:12.826 Executor: INFO: Finished task 34.0 in stage 85.0 (TID 482). 961 bytes result sent to driver 2023-04-22 21:17:12.826 TaskSetManager: INFO: Starting task 35.0 in stage 85.0 (TID 483) (uger-c010.broadinstitute.org, executor driver, partition 35, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:17:12.827 TaskSetManager: INFO: Finished task 34.0 in stage 85.0 (TID 482) in 7785 ms on uger-c010.broadinstitute.org (executor driver) (35/56) 2023-04-22 21:17:12.827 Executor: INFO: Running task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:12.849 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:12.849 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:18.196 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:18.678 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:19.650 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:19.896 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:19.922 : INFO: TaskReport: stage=85, partition=35, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=16435, cache hits=16432 2023-04-22 21:17:19.922 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 85.0 (TID 483) 2023-04-22 21:17:19.922 Executor: INFO: Finished task 35.0 in stage 85.0 (TID 483). 
961 bytes result sent to driver 2023-04-22 21:17:19.923 TaskSetManager: INFO: Starting task 36.0 in stage 85.0 (TID 484) (uger-c010.broadinstitute.org, executor driver, partition 36, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:19.928 TaskSetManager: INFO: Finished task 35.0 in stage 85.0 (TID 483) in 7102 ms on uger-c010.broadinstitute.org (executor driver) (36/56) 2023-04-22 21:17:19.931 Executor: INFO: Running task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:19.952 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:19.953 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.853 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.857 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.864 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.879 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.909 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:20.969 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:21.088 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:21.327 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:21.804 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:22.760 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 21:17:23.307 : INFO: TaskReport: stage=85, partition=36, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=14396, cache hits=10299 2023-04-22 21:17:23.307 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 85.0 (TID 484) 2023-04-22 
21:17:23.310 Executor: INFO: Finished task 36.0 in stage 85.0 (TID 484). 961 bytes result sent to driver 2023-04-22 21:17:23.313 TaskSetManager: INFO: Starting task 37.0 in stage 85.0 (TID 485) (uger-c010.broadinstitute.org, executor driver, partition 37, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:23.314 TaskSetManager: INFO: Finished task 36.0 in stage 85.0 (TID 484) in 3392 ms on uger-c010.broadinstitute.org (executor driver) (37/56) 2023-04-22 21:17:23.332 Executor: INFO: Running task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:23.363 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:23.363 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:24.324 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:24.806 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:25.772 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:26.017 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:26.042 : INFO: TaskReport: stage=85, partition=37, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=6207, cache hits=6204 2023-04-22 21:17:26.042 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 85.0 (TID 485) 2023-04-22 21:17:26.043 Executor: INFO: Finished task 37.0 in stage 85.0 (TID 485). 
961 bytes result sent to driver 2023-04-22 21:17:26.043 TaskSetManager: INFO: Starting task 38.0 in stage 85.0 (TID 486) (uger-c010.broadinstitute.org, executor driver, partition 38, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:26.043 TaskSetManager: INFO: Finished task 37.0 in stage 85.0 (TID 485) in 2730 ms on uger-c010.broadinstitute.org (executor driver) (38/56) 2023-04-22 21:17:26.045 Executor: INFO: Running task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:26.066 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:26.067 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.722 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.726 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.733 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.748 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.778 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.838 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:28.958 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:29.197 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:29.680 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:30.637 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 21:17:31.189 : INFO: TaskReport: stage=85, partition=38, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=18492, cache hits=14395 2023-04-22 21:17:31.189 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 85.0 (TID 486) 2023-04-22 
21:17:31.192 Executor: INFO: Finished task 38.0 in stage 85.0 (TID 486). 961 bytes result sent to driver 2023-04-22 21:17:31.192 TaskSetManager: INFO: Starting task 39.0 in stage 85.0 (TID 487) (uger-c010.broadinstitute.org, executor driver, partition 39, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:31.192 TaskSetManager: INFO: Finished task 38.0 in stage 85.0 (TID 486) in 5149 ms on uger-c010.broadinstitute.org (executor driver) (39/56) 2023-04-22 21:17:31.195 Executor: INFO: Running task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:31.216 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:31.217 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:33.941 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:34.430 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:35.396 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:35.641 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:35.667 : INFO: TaskReport: stage=85, partition=39, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=10303, cache hits=10300 2023-04-22 21:17:35.667 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 85.0 (TID 487) 2023-04-22 21:17:35.668 Executor: INFO: Finished task 39.0 in stage 85.0 (TID 487). 
961 bytes result sent to driver 2023-04-22 21:17:35.668 TaskSetManager: INFO: Starting task 40.0 in stage 85.0 (TID 488) (uger-c010.broadinstitute.org, executor driver, partition 40, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:17:35.668 TaskSetManager: INFO: Finished task 39.0 in stage 85.0 (TID 487) in 4476 ms on uger-c010.broadinstitute.org (executor driver) (40/56) 2023-04-22 21:17:35.669 Executor: INFO: Running task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:35.690 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:35.690 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.082 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.086 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.094 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.109 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.139 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.198 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.319 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:40.558 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:41.040 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:41.997 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 21:17:42.542 : INFO: TaskReport: stage=85, partition=40, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=22588, cache hits=18491 2023-04-22 21:17:42.542 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 85.0 (TID 488) 2023-04-22 
21:17:42.545 Executor: INFO: Finished task 40.0 in stage 85.0 (TID 488). 961 bytes result sent to driver 2023-04-22 21:17:42.546 TaskSetManager: INFO: Starting task 41.0 in stage 85.0 (TID 489) (uger-c010.broadinstitute.org, executor driver, partition 41, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:17:42.546 TaskSetManager: INFO: Finished task 40.0 in stage 85.0 (TID 488) in 6878 ms on uger-c010.broadinstitute.org (executor driver) (41/56) 2023-04-22 21:17:42.546 Executor: INFO: Running task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:42.568 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:42.568 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:47.024 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:47.508 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:48.475 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:48.722 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:48.745 : INFO: TaskReport: stage=85, partition=41, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=14399, cache hits=14396 2023-04-22 21:17:48.745 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 85.0 (TID 489) 2023-04-22 21:17:48.745 Executor: INFO: Finished task 41.0 in stage 85.0 (TID 489). 
961 bytes result sent to driver 2023-04-22 21:17:48.745 TaskSetManager: INFO: Starting task 42.0 in stage 85.0 (TID 490) (uger-c010.broadinstitute.org, executor driver, partition 42, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:48.746 TaskSetManager: INFO: Finished task 41.0 in stage 85.0 (TID 489) in 6201 ms on uger-c010.broadinstitute.org (executor driver) (42/56) 2023-04-22 21:17:48.746 Executor: INFO: Running task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.768 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.768 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.800 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.804 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.812 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.827 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.857 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:48.916 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:49.036 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:49.275 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:49.754 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:50.711 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 21:17:51.230 : INFO: TaskReport: stage=85, partition=42, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=12360, cache hits=8263 2023-04-22 21:17:51.230 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 85.0 (TID 490) 2023-04-22 
21:17:51.233 Executor: INFO: Finished task 42.0 in stage 85.0 (TID 490). 961 bytes result sent to driver 2023-04-22 21:17:51.235 TaskSetManager: INFO: Starting task 43.0 in stage 85.0 (TID 491) (uger-c010.broadinstitute.org, executor driver, partition 43, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:51.246 TaskSetManager: INFO: Finished task 42.0 in stage 85.0 (TID 490) in 2501 ms on uger-c010.broadinstitute.org (executor driver) (43/56) 2023-04-22 21:17:51.256 Executor: INFO: Running task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:51.278 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:51.278 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:51.371 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:51.857 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:52.823 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:53.069 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:53.094 : INFO: TaskReport: stage=85, partition=43, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=4171, cache hits=4168 2023-04-22 21:17:53.094 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 85.0 (TID 491) 2023-04-22 21:17:53.094 Executor: INFO: Finished task 43.0 in stage 85.0 (TID 491). 
961 bytes result sent to driver 2023-04-22 21:17:53.095 TaskSetManager: INFO: Starting task 44.0 in stage 85.0 (TID 492) (uger-c010.broadinstitute.org, executor driver, partition 44, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:53.095 TaskSetManager: INFO: Finished task 43.0 in stage 85.0 (TID 491) in 1860 ms on uger-c010.broadinstitute.org (executor driver) (44/56) 2023-04-22 21:17:53.095 Executor: INFO: Running task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:53.117 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:53.117 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:54.903 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:54.906 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:54.914 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:54.929 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:54.959 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:55.018 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:55.138 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:55.377 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:55.855 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:56.812 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 21:17:57.322 : INFO: TaskReport: stage=85, partition=44, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=16456, cache hits=12359 2023-04-22 21:17:57.322 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 85.0 (TID 492) 2023-04-22 
21:17:57.325 Executor: INFO: Finished task 44.0 in stage 85.0 (TID 492). 961 bytes result sent to driver 2023-04-22 21:17:57.325 TaskSetManager: INFO: Starting task 45.0 in stage 85.0 (TID 493) (uger-c010.broadinstitute.org, executor driver, partition 45, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:17:57.326 TaskSetManager: INFO: Finished task 44.0 in stage 85.0 (TID 492) in 4232 ms on uger-c010.broadinstitute.org (executor driver) (45/56) 2023-04-22 21:17:57.327 Executor: INFO: Running task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:17:57.353 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:17:57.353 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:17:59.192 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:17:59.677 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:18:00.644 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:18:00.889 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:18:00.912 : INFO: TaskReport: stage=85, partition=45, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=8267, cache hits=8264 2023-04-22 21:18:00.912 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 85.0 (TID 493) 2023-04-22 21:18:00.913 Executor: INFO: Finished task 45.0 in stage 85.0 (TID 493). 
961 bytes result sent to driver 2023-04-22 21:18:00.913 TaskSetManager: INFO: Starting task 46.0 in stage 85.0 (TID 494) (uger-c010.broadinstitute.org, executor driver, partition 46, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:00.913 TaskSetManager: INFO: Finished task 45.0 in stage 85.0 (TID 493) in 3588 ms on uger-c010.broadinstitute.org (executor driver) (46/56) 2023-04-22 21:18:00.914 Executor: INFO: Running task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:00.935 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:00.935 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.463 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.467 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.474 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.489 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.519 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.579 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.699 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:04.938 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:05.416 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:06.372 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 21:18:06.912 : INFO: TaskReport: stage=85, partition=46, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=20552, cache hits=16455 2023-04-22 21:18:06.912 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 85.0 (TID 494) 2023-04-22 
21:18:06.914 Executor: INFO: Finished task 46.0 in stage 85.0 (TID 494). 961 bytes result sent to driver 2023-04-22 21:18:06.915 TaskSetManager: INFO: Starting task 47.0 in stage 85.0 (TID 495) (uger-c010.broadinstitute.org, executor driver, partition 47, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:06.915 TaskSetManager: INFO: Finished task 46.0 in stage 85.0 (TID 494) in 6002 ms on uger-c010.broadinstitute.org (executor driver) (47/56) 2023-04-22 21:18:06.916 Executor: INFO: Running task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:06.937 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:06.937 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:10.520 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:11.002 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:11.966 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:12.217 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:12.246 : INFO: TaskReport: stage=85, partition=47, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=12363, cache hits=12360 2023-04-22 21:18:12.246 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 85.0 (TID 495) 2023-04-22 21:18:12.247 Executor: INFO: Finished task 47.0 in stage 85.0 (TID 495). 
961 bytes result sent to driver 2023-04-22 21:18:12.247 TaskSetManager: INFO: Starting task 48.0 in stage 85.0 (TID 496) (uger-c010.broadinstitute.org, executor driver, partition 48, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:18:12.247 TaskSetManager: INFO: Finished task 47.0 in stage 85.0 (TID 495) in 5332 ms on uger-c010.broadinstitute.org (executor driver) (48/56) 2023-04-22 21:18:12.249 Executor: INFO: Running task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:12.271 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:12.271 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.543 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.547 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.555 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.570 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.599 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.659 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:17.779 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:18.018 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:18.497 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:19.454 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 21:18:20.002 : INFO: TaskReport: stage=85, partition=48, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=24648, cache hits=20551 2023-04-22 21:18:20.002 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 85.0 (TID 496) 2023-04-22 
21:18:20.004 Executor: INFO: Finished task 48.0 in stage 85.0 (TID 496). 961 bytes result sent to driver 2023-04-22 21:18:20.005 TaskSetManager: INFO: Starting task 49.0 in stage 85.0 (TID 497) (uger-c010.broadinstitute.org, executor driver, partition 49, PROCESS_LOCAL, 4499 bytes) taskResourceAssignments Map() 2023-04-22 21:18:20.008 TaskSetManager: INFO: Finished task 48.0 in stage 85.0 (TID 496) in 7761 ms on uger-c010.broadinstitute.org (executor driver) (49/56) 2023-04-22 21:18:20.009 Executor: INFO: Running task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:20.031 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:20.031 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:25.375 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:25.859 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:26.828 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:27.074 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:27.096 : INFO: TaskReport: stage=85, partition=49, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=16459, cache hits=16456 2023-04-22 21:18:27.096 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 85.0 (TID 497) 2023-04-22 21:18:27.096 Executor: INFO: Finished task 49.0 in stage 85.0 (TID 497). 
961 bytes result sent to driver 2023-04-22 21:18:27.097 TaskSetManager: INFO: Starting task 50.0 in stage 85.0 (TID 498) (uger-c010.broadinstitute.org, executor driver, partition 50, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:27.097 TaskSetManager: INFO: Finished task 49.0 in stage 85.0 (TID 497) in 7092 ms on uger-c010.broadinstitute.org (executor driver) (50/56) 2023-04-22 21:18:27.110 Executor: INFO: Running task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:27.133 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:27.134 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.044 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.048 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.056 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.070 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.136 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.213 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.333 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:28.572 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:29.050 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:30.006 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 21:18:30.520 : INFO: TaskReport: stage=85, partition=50, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=14420, cache hits=10323 2023-04-22 21:18:30.520 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 85.0 (TID 498) 2023-04-22 
21:18:30.523 Executor: INFO: Finished task 50.0 in stage 85.0 (TID 498). 961 bytes result sent to driver 2023-04-22 21:18:30.523 TaskSetManager: INFO: Starting task 51.0 in stage 85.0 (TID 499) (uger-c010.broadinstitute.org, executor driver, partition 51, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:30.524 TaskSetManager: INFO: Finished task 50.0 in stage 85.0 (TID 498) in 3427 ms on uger-c010.broadinstitute.org (executor driver) (51/56) 2023-04-22 21:18:30.524 Executor: INFO: Running task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:30.546 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:30.546 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:31.518 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:32.001 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:32.967 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:33.212 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:33.238 : INFO: TaskReport: stage=85, partition=51, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=6231, cache hits=6228 2023-04-22 21:18:33.238 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 85.0 (TID 499) 2023-04-22 21:18:33.238 Executor: INFO: Finished task 51.0 in stage 85.0 (TID 499). 
961 bytes result sent to driver 2023-04-22 21:18:33.239 TaskSetManager: INFO: Starting task 52.0 in stage 85.0 (TID 500) (uger-c010.broadinstitute.org, executor driver, partition 52, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:33.239 TaskSetManager: INFO: Finished task 51.0 in stage 85.0 (TID 499) in 2716 ms on uger-c010.broadinstitute.org (executor driver) (52/56) 2023-04-22 21:18:33.239 Executor: INFO: Running task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:33.261 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:33.261 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:35.920 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:35.923 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:35.931 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:35.946 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:35.976 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:36.035 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:36.155 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:36.393 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:36.871 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:37.824 : INFO: RegionPool: REPORT_THRESHOLD: 256.0M allocated (512.0K blocks / 255.5M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 21:18:38.358 : INFO: TaskReport: stage=85, partition=52, attempt=0, peakBytes=403111952, peakBytesReadable=384.44 MiB, chunks requested=18516, cache hits=14419 2023-04-22 21:18:38.358 : INFO: RegionPool: FREE: 384.4M allocated (512.0K blocks / 383.9M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 85.0 (TID 500) 2023-04-22 
21:18:38.360 Executor: INFO: Finished task 52.0 in stage 85.0 (TID 500). 1004 bytes result sent to driver 2023-04-22 21:18:38.361 TaskSetManager: INFO: Starting task 53.0 in stage 85.0 (TID 501) (uger-c010.broadinstitute.org, executor driver, partition 53, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:38.361 TaskSetManager: INFO: Finished task 52.0 in stage 85.0 (TID 500) in 5122 ms on uger-c010.broadinstitute.org (executor driver) (53/56) 2023-04-22 21:18:38.362 Executor: INFO: Running task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:38.386 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:38.386 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:41.109 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:41.592 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:42.557 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:42.802 : INFO: RegionPool: REPORT_THRESHOLD: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:42.831 : INFO: TaskReport: stage=85, partition=53, attempt=0, peakBytes=4521984, peakBytesReadable=4.31 MiB, chunks requested=10327, cache hits=10324 2023-04-22 21:18:42.831 : INFO: RegionPool: FREE: 4.3M allocated (2.2M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 85.0 (TID 501) 2023-04-22 21:18:42.831 Executor: INFO: Finished task 53.0 in stage 85.0 (TID 501). 
961 bytes result sent to driver 2023-04-22 21:18:42.832 TaskSetManager: INFO: Starting task 54.0 in stage 85.0 (TID 502) (uger-c010.broadinstitute.org, executor driver, partition 54, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:42.832 TaskSetManager: INFO: Finished task 53.0 in stage 85.0 (TID 501) in 4471 ms on uger-c010.broadinstitute.org (executor driver) (54/56) 2023-04-22 21:18:42.832 Executor: INFO: Running task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:42.854 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:42.854 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.264 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.267 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (384.0K blocks / 640.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.275 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (384.0K blocks / 1.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.290 : INFO: RegionPool: REPORT_THRESHOLD: 4.0M allocated (384.0K blocks / 3.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.320 : INFO: RegionPool: REPORT_THRESHOLD: 8.0M allocated (384.0K blocks / 7.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.379 : INFO: RegionPool: REPORT_THRESHOLD: 16.0M allocated (384.0K blocks / 15.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.499 : INFO: RegionPool: REPORT_THRESHOLD: 32.0M allocated (384.0K blocks / 31.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:47.738 : INFO: RegionPool: REPORT_THRESHOLD: 64.0M allocated (384.0K blocks / 63.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:48.216 : INFO: RegionPool: REPORT_THRESHOLD: 128.0M allocated (448.0K blocks / 127.6M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:49.129 : INFO: RegionPool: REPORT_THRESHOLD: 375.3M allocated (448.0K blocks / 374.8M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 21:18:49.594 : INFO: TaskReport: stage=85, partition=54, attempt=0, peakBytes=393510928, peakBytesReadable=375.28 MiB, chunks requested=22321, cache hits=18321 2023-04-22 21:18:49.594 : INFO: RegionPool: FREE: 375.3M allocated (448.0K blocks / 374.8M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 85.0 (TID 502) 2023-04-22 
21:18:49.597 Executor: INFO: Finished task 54.0 in stage 85.0 (TID 502). 961 bytes result sent to driver 2023-04-22 21:18:49.597 TaskSetManager: INFO: Starting task 55.0 in stage 85.0 (TID 503) (uger-c010.broadinstitute.org, executor driver, partition 55, PROCESS_LOCAL, 4418 bytes) taskResourceAssignments Map() 2023-04-22 21:18:49.598 TaskSetManager: INFO: Finished task 54.0 in stage 85.0 (TID 502) in 6766 ms on uger-c010.broadinstitute.org (executor driver) (55/56) 2023-04-22 21:18:49.598 Executor: INFO: Running task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:49.620 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:49.620 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (192.0K blocks / 64.0K chunks), regions.size = 3, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:54.089 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (384.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:54.579 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (896.0K blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:55.545 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.9M blocks / 128.0K chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:55.748 : INFO: RegionPool: REPORT_THRESHOLD: 4.2M allocated (2.1M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:55.771 : INFO: TaskReport: stage=85, partition=55, attempt=0, peakBytes=4390912, peakBytesReadable=4.19 MiB, chunks requested=14325, cache hits=14323 2023-04-22 21:18:55.771 : INFO: RegionPool: FREE: 4.2M allocated (2.1M blocks / 2.1M chunks), regions.size = 5, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 85.0 (TID 503) 2023-04-22 21:18:55.771 Executor: INFO: Finished task 55.0 in stage 85.0 (TID 503). 961 bytes result sent to driver 2023-04-22 21:18:55.772 TaskSetManager: INFO: Finished task 55.0 in stage 85.0 (TID 503) in 6175 ms on uger-c010.broadinstitute.org (executor driver) (56/56) 2023-04-22 21:18:55.772 TaskSchedulerImpl: INFO: Removed TaskSet 85.0, whose tasks have all completed, from pool 2023-04-22 21:18:55.772 DAGScheduler: INFO: ResultStage 85 (collect at SparkBackend.scala:368) finished in 271.410 s 2023-04-22 21:18:55.772 DAGScheduler: INFO: Job 46 is finished. 
Cancelling potential speculative or zombie tasks for this job
2023-04-22 21:18:55.772 TaskSchedulerImpl: INFO: Killing all running tasks in stage 85: Stage finished
2023-04-22 21:18:55.773 DAGScheduler: INFO: Job 46 finished: collect at SparkBackend.scala:368, took 271.411577 s
2023-04-22 21:18:55.773 : INFO: executed D-Array [matrix_block_matrix_writer] in 4m31.6s
2023-04-22 21:18:55.856 Hail: INFO: wrote matrix with 114591 rows and 4151 columns as 56 blocks of size 4096 to /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX
2023-04-22 21:18:55.886 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:18:55.886 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:18:55.886 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:18:55.886 : INFO: took 5m12.3s
2023-04-22 21:18:55.886 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Begin)
2023-04-22 21:18:55.887 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Begin)
2023-04-22 21:18:55.888 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:18:55.888 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:18:55.888 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:18:55.888 : INFO: finished execution of query hail_query_7, result size is 0.00 B
2023-04-22 21:18:55.888 : INFO: RegionPool: initialized for thread 14: Thread-5
2023-04-22 21:18:55.888 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0
2023-04-22 21:18:55.888 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5
2023-04-22 21:18:55.888 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode total 5m12.5s self 34.970ms children 5m12.5s %children 99.99%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR total 44.466ms self 0.018ms children 44.449ms %children 99.96%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.025ms self 0.025ms children 0.000ms %children 0.00%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation total 44.302ms self 0.024ms children 44.278ms %children 99.95%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 44.278ms self 0.087ms children 44.192ms %children 99.80%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.201ms self 0.201ms children 0.000ms %children 0.00%
2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.642ms self 0.642ms children 0.000ms %children 0.00%
2023-04-22 21:18:55.888 : INFO: timing 
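
The 21:18:55.856 record above is the end product of Spark job 46: a 114591 x 4151 matrix written as a Hail BlockMatrix in 56 blocks of the default block size 4096 (ceil(114591/4096) = 28 row blocks times ceil(4151/4096) = 2 column blocks). The Python driver script is not captured in this log; the PC_Relate output directory and the random temp-style name UwaCOvX5T5QjBkSyEGmuOX are consistent with an intermediate block matrix persisted during an hl.pc_relate() run, but that is an inference. A minimal, hypothetical sketch of the kind of user-level call that produces a block-matrix write like this one (paths and entry fields are placeholders, not taken from this log):

import hail as hl
from hail.linalg import BlockMatrix

hl.init()  # Spark-backed Hail session

# Hypothetical input MatrixTable; the actual dataset behind this log is not shown.
mt = hl.read_matrix_table('/path/to/genotypes.mt')

# Converting entries to a BlockMatrix and writing it is the kind of operation
# behind a matrix_block_matrix_writer D-Array job. block_size=4096 is Hail's
# default and matches the "56 blocks of size 4096" reported above.
bm = BlockMatrix.from_entry_expr(mt.GT.n_alt_alleles(),
                                 mean_impute=True,
                                 block_size=4096)
bm.write('/path/to/output_block_matrix', overwrite=True)
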
SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 9.997ms self 9.997ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.513ms self 1.513ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.800ms self 1.800ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.207ms self 0.207ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 2.288ms self 2.288ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.244ms self 0.244ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.269ms self 0.269ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.547ms self 0.547ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.609ms self 0.609ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 18.505ms self 18.505ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.169ms self 0.169ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 2.016ms self 2.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.888 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.241ms self 0.241ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.268ms self 0.268ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.527ms self 0.527ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.601ms self 0.601ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 1.461ms self 1.461ms 
children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 1.922ms self 1.922ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, initial IR/Verify total 0.122ms self 0.122ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable total 34.355ms self 0.017ms children 34.338ms %children 99.95% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/LoweringTransformation total 33.902ms self 33.902ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerMatrixToTable/Verify total 0.419ms self 0.419ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable total 43.848ms self 0.003ms children 43.845ms %children 99.99% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 43.738ms self 0.013ms children 43.726ms %children 99.97% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 43.726ms self 0.066ms children 43.659ms %children 99.85% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.372ms self 0.372ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.514ms self 0.514ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.100ms self 1.100ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 10.550ms self 10.550ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 1.869ms self 1.869ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.163ms self 0.163ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 3.356ms self 3.356ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.199ms self 0.199ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.326ms self 0.326ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.556ms self 0.556ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.048ms self 1.048ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 1.895ms self 1.895ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.145ms self 0.145ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 2.643ms self 2.643ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.190ms self 0.190ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 12.888ms self 12.888ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.551ms self 0.551ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 1.289ms self 1.289ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 1.460ms self 1.460ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 2.412ms self 2.412ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing 
SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets total 0.223ms self 0.002ms children 0.222ms %children 99.33% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LiftRelationalValuesToRelationalLets/Verify total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets total 1.805ms self 0.003ms children 1.802ms %children 99.84% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/LoweringTransformation total 1.723ms self 1.723ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/EvalRelationalLets/Verify total 0.045ms self 0.045ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles total 11.170ms self 0.003ms children 11.167ms %children 99.97% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/LoweringTransformation total 0.479ms self 0.479ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/LowerAndExecuteShuffles/Verify total 10.649ms self 10.649ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles total 31.724ms self 0.003ms children 31.722ms %children 99.99% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 31.630ms self 0.012ms children 31.618ms %children 99.96% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 31.618ms self 0.073ms children 31.545ms %children 99.77% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.143ms self 0.143ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.984ms self 0.984ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.560ms self 0.560ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.953ms self 0.953ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 1.912ms self 1.912ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.137ms self 0.137ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 2.453ms self 2.453ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.201ms self 0.201ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.317ms self 0.317ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.494ms self 0.494ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 1.089ms self 1.089ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 1.401ms self 1.401ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 13.736ms self 13.736ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 1.978ms self 1.978ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.889 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.191ms self 0.191ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.308ms self 0.308ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.534ms self 0.534ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.704ms self 0.704ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 1.459ms self 1.459ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.122ms self 0.122ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 1.870ms self 1.870ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable total 5m12.3s self 0.006ms children 5m12.3s %children 100.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation total 5m12.3s self 1.095s children 5m11.2s %children 99.65% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate total 53.626ms self 12.726ms children 40.900ms %children 76.27% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR total 1.116ms self 0.003ms children 1.113ms %children 99.75% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation total 1.099ms self 0.018ms children 1.081ms %children 98.36% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 1.081ms self 0.048ms children 1.033ms %children 95.57% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.098ms self 0.098ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.170ms self 0.170ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial 
IR/LoweringTransformation/Optimize/Simplify total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.106ms self 0.106ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 
21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable total 0.029ms self 0.001ms children 0.028ms %children 95.52% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/LoweringTransformation total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable total 0.099ms self 0.001ms children 0.098ms %children 98.82% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 0.096ms self 0.010ms children 0.086ms %children 89.87% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize total 0.086ms self 0.009ms children 0.077ms %children 89.99% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets total 0.020ms self 0.001ms children 0.019ms %children 94.33% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets total 0.010ms self 0.001ms children 0.009ms %children 92.28% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/LoweringTransformation total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles total 0.013ms self 0.001ms children 0.013ms %children 94.45% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/LoweringTransformation total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.890 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles total 0.082ms self 0.002ms children 0.080ms %children 98.07% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 0.078ms self 0.006ms children 0.073ms %children 92.55% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 0.073ms self 0.007ms children 0.065ms %children 90.14% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable total 0.025ms self 0.001ms children 0.024ms %children 96.40% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/LoweringTransformation total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.076ms self 0.001ms children 0.075ms %children 98.79% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.074ms self 0.006ms children 0.068ms %children 92.52% 2023-04-22 21:18:55.891 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.068ms self 0.007ms children 0.062ms %children 90.45% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile total 37.877ms self 36.187ms children 1.689ms %children 4.46% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 0.231ms self 0.002ms children 0.229ms %children 99.30% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.223ms self 0.008ms children 0.214ms %children 96.22% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.214ms self 0.016ms children 0.198ms %children 92.52% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.017ms self 0.001ms children 0.016ms %children 93.59% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 0.092ms self 0.001ms children 0.091ms %children 98.94% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.088ms self 0.006ms children 0.082ms %children 92.82% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.082ms self 0.008ms children 0.074ms %children 90.36% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.891 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.118ms self 0.001ms children 0.117ms %children 99.05% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.107ms self 0.107ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.100ms self 0.001ms children 0.099ms %children 98.89% 2023-04-22 21:18:55.892 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.096ms self 0.006ms children 0.090ms %children 93.38% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.090ms self 0.008ms children 0.082ms %children 91.04% 2023-04-22 21:18:55.892 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.904 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 0.212ms self 0.002ms children 0.210ms %children 99.17% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.204ms self 0.008ms children 0.195ms %children 95.95% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.195ms self 0.014ms children 0.181ms %children 92.69% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.013ms self 0.001ms children 0.012ms %children 92.18% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 0.087ms self 0.001ms children 0.086ms %children 98.97% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.084ms self 0.006ms children 0.077ms %children 92.37% 2023-04-22 21:18:55.905 : INFO: 
timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.077ms self 0.008ms children 0.069ms %children 89.92% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.070ms self 0.001ms children 0.069ms %children 98.27% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.006ms self 0.006ms children 0.000ms 
%children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.094ms self 0.003ms children 0.092ms %children 97.33% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.090ms self 0.007ms children 0.083ms %children 92.63% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.083ms self 0.010ms children 0.073ms %children 87.59% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms 
%children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 0.207ms self 0.001ms children 0.206ms %children 99.30% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 0.200ms self 0.008ms children 0.192ms %children 96.24% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 0.192ms self 0.017ms children 0.176ms %children 91.28% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.905 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 
0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.002ms self 0.002ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.013ms self 0.001ms children 0.012ms %children 92.09% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 0.083ms self 0.001ms children 0.083ms %children 98.89% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 0.080ms self 0.006ms children 0.074ms %children 92.43% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 0.074ms self 0.007ms children 0.067ms %children 90.04% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.062ms self 0.001ms children 0.061ms %children 98.55% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.054ms 
self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 0.089ms self 0.001ms children 0.088ms %children 98.81% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 0.086ms self 0.006ms children 0.080ms %children 93.06% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 0.080ms self 0.007ms children 0.072ms %children 90.69% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.004ms self 0.004ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.021ms self 0.021ms children 0.000ms 
%children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/InitializeCompiledFunction total 1.550ms self 1.550ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/RunCompiledFunction total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate total 36.306s self 5.250ms children 36.301s %children 99.99% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR total 18.721ms self 0.003ms children 18.717ms %children 99.98% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/Verify total 12.543ms self 12.543ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation total 6.160ms self 0.017ms children 6.142ms %children 99.72% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 6.142ms self 0.042ms children 6.101ms %children 99.32% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.203ms self 0.203ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.374ms self 0.374ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.597ms self 0.597ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.436ms self 1.436ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.456ms 
self 0.456ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.302ms self 0.302ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.298ms self 0.298ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.050ms self 0.050ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.209ms self 0.209ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.786ms self 0.786ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.906 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.087ms self 0.087ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.153ms self 0.153ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.667ms self 0.667ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, initial IR/Verify total 0.014ms self 0.014ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable total 0.054ms self 0.001ms children 0.053ms %children 97.54% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/LoweringTransformation total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerMatrixToTable/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable total 6.794ms self 0.002ms children 6.792ms %children 99.97% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 6.245ms self 0.011ms children 6.233ms %children 99.82% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 6.233ms self 0.741ms children 5.492ms %children 88.12% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after 
LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 1.220ms self 1.220ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.411ms self 0.411ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.272ms self 0.272ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.280ms self 0.280ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.375ms self 0.375ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.707ms self 0.707ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 1.059ms self 1.059ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.147ms self 0.147ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.209ms self 0.209ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 0.134ms self 0.134ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 0.172ms self 0.172ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.542ms self 0.542ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets total 0.048ms self 0.001ms children 0.046ms %children 96.89% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LiftRelationalValuesToRelationalLets/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets total 0.035ms self 0.001ms children 0.033ms %children 96.16% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/LoweringTransformation total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/EvalRelationalLets/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles total 0.038ms self 0.001ms children 0.036ms %children 96.06% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/Verify total 0.008ms self 0.008ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/LoweringTransformation total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerAndExecuteShuffles/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles total 14.608ms self 0.002ms children 14.606ms %children 99.99% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 14.592ms self 0.012ms children 14.580ms %children 99.92% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 14.580ms self 0.030ms children 14.550ms %children 99.79% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: 
relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.024ms self 0.024ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.044ms self 0.044ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.184ms self 0.184ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.031ms self 0.031ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.171ms self 0.171ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.032ms self 0.032ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.907 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 0.062ms self 0.062ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 13.087ms self 13.087ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 0.188ms self 0.188ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable total 0.054ms self 0.001ms children 0.053ms %children 97.60% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/LoweringTransformation total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/LowerOrInterpretNonCompilable/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 1.601ms self 0.002ms children 1.599ms %children 99.88% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 1.585ms self 0.012ms children 1.573ms %children 99.26% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 1.573ms self 0.029ms children 1.544ms %children 98.15% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.043ms self 0.043ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.067ms self 0.067ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.066ms self 0.066ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.135ms self 0.135ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.179ms self 0.179ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.064ms self 0.064ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.183ms self 0.183ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.065ms self 0.065ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, 
after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.063ms self 0.063ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.167ms self 0.167ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile total 203.693ms self 166.661ms children 37.032ms %children 18.18% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 1.975ms self 0.002ms children 1.973ms %children 99.88% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.006ms self 0.006ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 1.955ms self 0.016ms children 1.939ms %children 99.20% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 1.939ms self 0.035ms children 1.904ms %children 98.21% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.084ms self 0.084ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.232ms self 0.232ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.084ms self 0.084ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.215ms self 0.215ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.050ms self 0.002ms children 0.048ms %children 96.30% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.022ms self 0.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 16.039ms self 0.002ms children 16.037ms %children 99.98% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: 
compileLowerer, after InlineApplyIR/LoweringTransformation total 16.020ms self 0.016ms children 16.004ms %children 99.90% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 16.004ms self 0.035ms children 15.969ms %children 99.78% 2023-04-22 21:18:55.908 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.026ms self 0.026ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.159ms self 0.159ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.230ms self 0.230ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.204ms self 0.204ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 14.235ms self 14.235ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.029ms self 0.029ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.235ms self 0.235ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.618ms self 0.002ms children 0.616ms %children 99.64% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.590ms self 0.590ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 1.928ms self 0.003ms children 1.926ms %children 99.87% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 1.907ms self 0.025ms children 1.883ms %children 98.71% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 1.883ms self 0.034ms children 1.849ms %children 98.22% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.028ms self 0.028ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.160ms self 0.160ms children 
0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.218ms self 0.218ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.078ms self 0.078ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.163ms self 0.163ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.221ms self 0.221ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.013ms self 0.013ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.488ms self 0.488ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 1.984ms self 0.002ms children 1.982ms %children 99.90% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 1.964ms self 0.016ms children 1.949ms %children 99.19% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 1.949ms self 0.036ms children 1.913ms %children 98.16% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.060ms self 0.060ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.092ms self 0.092ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.177ms self 0.177ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.251ms self 0.251ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.076ms self 0.076ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 
0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.909 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.051ms self 0.051ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.148ms self 0.148ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.201ms self 0.201ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.039ms self 0.002ms children 0.038ms %children 95.70% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.017ms self 0.017ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 3.425ms self 0.002ms children 3.423ms %children 99.94% 2023-04-22 21:18:55.910 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 3.407ms self 0.015ms children 3.392ms %children 99.57% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 3.392ms self 0.037ms children 3.355ms %children 98.90% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.082ms self 0.082ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.156ms self 0.156ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 1.660ms self 1.660ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.046ms self 0.046ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.048ms self 0.048ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.169ms self 0.169ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.250ms self 0.250ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.152ms self 0.152ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.202ms self 0.202ms children 0.000ms %children 0.00% 
2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.478ms self 0.002ms children 0.476ms %children 99.54% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.452ms self 0.452ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 1.890ms self 0.002ms children 1.888ms %children 99.90% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 1.872ms self 0.015ms children 1.857ms %children 99.21% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 1.857ms self 0.034ms children 1.824ms %children 98.19% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms 
%children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.216ms self 0.216ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.149ms self 0.149ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.203ms self 0.203ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.069ms self 0.069ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.146ms self 0.146ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.212ms self 0.212ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.010ms self 0.010ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.453ms self 0.453ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR total 1.972ms self 0.002ms children 1.969ms %children 99.89% 2023-04-22 21:18:55.910 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.007ms self 0.007ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 1.951ms self 0.016ms children 1.935ms %children 99.18% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 1.935ms self 0.034ms children 1.901ms %children 98.23% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/FoldConstants total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.075ms self 0.075ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.164ms self 0.164ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.020ms self 0.020ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.226ms self 0.226ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.038ms self 0.038ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.073ms self 0.073ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.082ms self 0.082ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.154ms self 0.154ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms 
self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.205ms self 0.205ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 0.074ms self 0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 0.150ms self 0.150ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 0.223ms self 0.223ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, initial IR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR total 0.040ms self 0.001ms children 0.038ms %children 96.35% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/LoweringTransformation total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/InlineApplyIR/Verify total 0.009ms self 0.009ms children 0.000ms %children 0.00% 2023-04-22 
21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR total 2.830ms self 0.002ms children 2.828ms %children 99.92% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 2.811ms self 0.015ms children 2.796ms %children 99.46% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 2.796ms self 0.037ms children 2.759ms %children 98.67% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.036ms self 0.036ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.155ms self 0.155ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.023ms self 0.023ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.217ms self 0.217ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.071ms self 0.071ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.982ms self 0.982ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.033ms self 0.033ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.272ms self 0.272ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.039ms self 0.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.042ms self 0.042ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 
0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 0.208ms self 0.208ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs total 0.473ms self 0.002ms children 0.471ms %children 99.57% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 0.446ms self 0.446ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/LowerArrayAggsToRunAggs/Verify total 0.012ms self 0.012ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 1.894ms self 0.002ms children 1.892ms %children 99.89% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.005ms self 0.005ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 1.875ms self 0.016ms children 1.860ms %children 99.17% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 1.860ms self 0.034ms children 1.826ms %children 98.19% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.027ms self 0.027ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.084ms self 0.084ms children 0.000ms 
%children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.167ms self 0.167ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.911 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.021ms self 0.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.213ms self 0.213ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.037ms self 0.037ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.040ms self 0.040ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.080ms self 0.080ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.018ms self 0.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.200ms self 0.200ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.035ms self 0.035ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.041ms self 0.041ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 0.070ms self 0.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.019ms self 0.019ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 0.207ms self 0.207ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/Compile/EmitContext.analyze total 0.456ms self 0.456ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/InitializeCompiledFunction total 2.345ms self 2.345ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/CompileAndEvaluate/RunCompiledFunction total 36.053s self 36.053s children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR total 146.982ms self 0.005ms children 146.978ms %children 100.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.130ms self 0.130ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: 
timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation total 146.652ms self 0.029ms children 146.623ms %children 99.98% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize total 146.623ms self 0.132ms children 146.491ms %children 99.91% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 1.533ms self 1.533ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.128ms self 1.128ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 4.363ms self 4.363ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 4.197ms self 4.197ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 18.628ms self 18.628ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.861ms self 0.861ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 34.877ms self 34.877ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 1.216ms self 1.216ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 23.131ms self 23.131ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.370ms self 1.370ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 2.105ms self 2.105ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: 
relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.702ms self 4.702ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 1.023ms self 1.023ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.737ms self 5.737ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 1.105ms self 1.105ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.843ms self 0.843ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 14.282ms self 14.282ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.994ms self 1.994ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.345ms self 4.345ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.396ms self 0.396ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 18.657ms self 18.657ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, initial IR/Verify total 0.196ms self 0.196ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable total 0.707ms self 0.002ms children 0.705ms %children 99.65% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/LoweringTransformation total 0.513ms self 0.513ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerMatrixToTable/Verify total 
0.098ms self 0.098ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable total 80.250ms self 0.003ms children 80.248ms %children 100.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/Verify total 0.081ms self 0.081ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation total 79.977ms self 0.040ms children 79.937ms %children 99.95% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize total 79.937ms self 0.073ms children 79.864ms %children 99.91% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.450ms self 0.450ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.830ms self 0.830ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.225ms self 1.225ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 3.235ms self 3.235ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 4.525ms self 4.525ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.396ms self 0.396ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 20.560ms self 20.560ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.975ms self 0.975ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.847ms self 
0.847ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.217ms self 1.217ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 2.150ms self 2.150ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 17.480ms self 17.480ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 8.070ms self 8.070ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/FoldConstants total 0.992ms self 0.992ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.849ms self 0.849ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.912 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/NormalizeNames total 1.249ms self 1.249ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/Simplify total 2.135ms self 2.135ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardLets total 5.249ms self 5.249ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.398ms self 0.398ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerMatrixToTable/LoweringTransformation/Optimize/PruneDeadFields total 6.638ms self 6.638ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: 
relationalLowerer, after LowerMatrixToTable/Verify total 0.189ms self 0.189ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets total 0.854ms self 0.003ms children 0.851ms %children 99.66% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.112ms self 0.112ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/LoweringTransformation total 0.636ms self 0.636ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LiftRelationalValuesToRelationalLets/Verify total 0.103ms self 0.103ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets total 0.937ms self 0.003ms children 0.934ms %children 99.69% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.106ms self 0.106ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/LoweringTransformation total 0.699ms self 0.699ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/EvalRelationalLets/Verify total 0.128ms self 0.128ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles total 8.782ms self 0.004ms children 8.778ms %children 99.95% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.158ms self 0.158ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/LoweringTransformation total 8.453ms self 8.453ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerAndExecuteShuffles/Verify total 0.167ms self 0.167ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.913 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles total 116.498ms self 0.003ms children 116.495ms %children 100.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.079ms self 0.079ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation total 116.211ms self 0.014ms children 116.196ms %children 99.99% 2023-04-22 21:18:55.929 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize total 116.196ms self 0.065ms children 116.131ms %children 99.94% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.461ms self 0.461ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.833ms self 0.833ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.270ms self 1.270ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 27.271ms self 27.271ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 25.124ms self 25.124ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.415ms self 0.415ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 5.527ms self 5.527ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 2.953ms self 2.953ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.871ms self 0.871ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 1.239ms self 1.239ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 2.182ms self 2.182ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 25.159ms self 25.159ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.425ms self 0.425ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 5.387ms self 5.387ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/FoldConstants total 0.974ms self 0.974ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.826ms self 0.826ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/NormalizeNames total 2.465ms self 2.465ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/Simplify total 2.212ms self 2.212ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardLets total 3.996ms self 3.996ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/ForwardRelationalLets total 0.408ms self 0.408ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/LoweringTransformation/Optimize/PruneDeadFields total 6.135ms self 6.135ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerAndExecuteShuffles/Verify total 0.205ms self 0.205ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable total 2.004ms self 0.003ms children 2.000ms %children 99.83% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/LoweringTransformation 
total 1.773ms self 1.773ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/LowerOrInterpretNonCompilable/Verify total 0.115ms self 0.115ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 91.525ms self 0.002ms children 91.522ms %children 100.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.052ms self 0.052ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 91.288ms self 0.014ms children 91.273ms %children 99.98% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 91.273ms self 0.062ms children 91.211ms %children 99.93% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.441ms self 0.441ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.831ms self 0.831ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 1.227ms self 1.227ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 2.533ms self 2.533ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 4.062ms self 4.062ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.389ms self 0.389ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 14.583ms self 14.583ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.915ms self 0.915ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.860ms self 0.860ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 1.183ms self 1.183ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 2.043ms self 2.043ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 13.247ms self 13.247ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.395ms self 0.395ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 19.353ms self 19.353ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.929 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.946ms self 0.946ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.847ms self 0.847ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 1.206ms self 1.206ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 2.043ms self 2.043ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 17.144ms self 17.144ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.423ms self 0.423ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 6.540ms self 6.540ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.182ms self 0.182ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile total 2.581s self 1.555s children 1.027s %children 39.77% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 85.771ms self 0.003ms children 85.768ms %children 100.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.058ms self 0.058ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 85.536ms self 0.014ms children 85.521ms %children 99.98% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 85.521ms self 0.062ms children 85.459ms %children 99.93% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.466ms self 0.466ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.829ms self 0.829ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.203ms self 1.203ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 2.063ms self 2.063ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 17.185ms self 17.185ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.437ms self 0.437ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.315ms self 5.315ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.925ms self 0.925ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.837ms self 0.837ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.185ms self 1.185ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 15.113ms self 15.113ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.072ms self 4.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.398ms self 0.398ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.833ms self 5.833ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.918ms self 0.918ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.851ms self 0.851ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.180ms self 1.180ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 2.039ms self 2.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 18.168ms self 18.168ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 6.048ms self 6.048ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.174ms self 0.174ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.503ms self 0.002ms children 0.501ms %children 99.66% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.290ms self 0.290ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.112ms self 0.112ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 82.658ms self 0.003ms children 82.656ms %children 100.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.054ms self 0.054ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 82.451ms self 0.011ms children 82.439ms %children 99.99% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 82.439ms self 0.062ms children 82.378ms %children 99.93% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.458ms self 0.458ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.827ms self 0.827ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.206ms self 1.206ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.039ms self 2.039ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.474ms self 4.474ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.400ms self 0.400ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 5.285ms self 5.285ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.912ms self 0.912ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.830ms self 0.830ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 15.282ms self 15.282ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.049ms self 2.049ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 4.060ms self 4.060ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.392ms self 0.392ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 5.913ms self 5.913ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 1.466ms self 1.466ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters 
total 0.845ms self 0.845ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.201ms self 1.201ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.035ms self 2.035ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 27.076ms self 27.076ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.403ms self 0.403ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 5.226ms self 5.226ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.151ms self 0.151ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 41.262ms self 0.003ms children 41.259ms %children 99.99% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.119ms self 0.119ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 41.000ms self 41.000ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.139ms self 0.139ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.930 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 86.648ms self 0.003ms children 86.646ms %children 100.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.061ms self 0.061ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 86.272ms self 0.013ms children 86.258ms %children 99.98% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 86.258ms self 0.064ms children 86.195ms %children 99.93% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.491ms self 0.491ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.873ms self 0.873ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.280ms self 1.280ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 24.913ms self 24.913ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 4.107ms self 4.107ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.402ms self 0.402ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 20.628ms self 20.628ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.992ms self 0.992ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.880ms self 0.880ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.227ms self 1.227ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 2.285ms self 2.285ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 
4.058ms self 4.058ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.407ms self 0.407ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 6.619ms self 6.619ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 1.021ms self 1.021ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.884ms self 0.884ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.224ms self 1.224ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 2.117ms self 2.117ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 5.746ms self 5.746ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.424ms self 0.424ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 5.616ms self 5.616ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.313ms self 0.313ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 35.693ms self 35.693ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 97.268ms self 0.004ms children 97.264ms %children 100.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.074ms self 
0.074ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 97.043ms self 0.016ms children 97.027ms %children 99.98% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 97.027ms self 0.079ms children 96.949ms %children 99.92% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.492ms self 0.492ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.955ms self 0.955ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 20.474ms self 20.474ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 2.206ms self 2.206ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.156ms self 4.156ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.400ms self 0.400ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 16.235ms self 16.235ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.929ms self 0.929ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.842ms self 0.842ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.193ms self 1.193ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 13.134ms self 13.134ms children 0.000ms %children 0.00% 2023-04-22 
21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.018ms self 4.018ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.397ms self 0.397ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 6.439ms self 6.439ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.911ms self 0.911ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.836ms self 0.836ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.192ms self 1.192ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.890ms self 1.890ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 4.196ms self 4.196ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.413ms self 0.413ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 15.642ms self 15.642ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.147ms self 0.147ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.479ms self 0.002ms children 0.477ms %children 99.56% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.105ms self 0.105ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.279ms self 0.279ms children 
0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 102.907ms self 0.003ms children 102.905ms %children 100.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.056ms self 0.056ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 102.687ms self 0.015ms children 102.671ms %children 99.99% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 102.671ms self 0.064ms children 102.607ms %children 99.94% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.439ms self 0.439ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.823ms self 0.823ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 2.766ms self 2.766ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 3.181ms self 3.181ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 13.164ms self 13.164ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.751ms self 0.751ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 17.419ms self 17.419ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 1.104ms self 1.104ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.842ms self 0.842ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.931 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.210ms self 1.210ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.940ms self 1.940ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 3.906ms self 3.906ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.382ms self 0.382ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 26.357ms self 26.357ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.889ms self 0.889ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.833ms self 0.833ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.174ms self 1.174ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.875ms self 1.875ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 5.164ms self 5.164ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.388ms self 0.388ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 18.000ms self 18.000ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.162ms self 0.162ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 23.645ms self 0.003ms children 23.642ms %children 99.99% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.113ms self 0.113ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 23.396ms self 23.396ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.133ms self 0.133ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 111.802ms self 0.003ms children 111.798ms %children 100.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 111.592ms self 0.014ms children 111.578ms %children 99.99% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 111.578ms self 0.066ms children 111.512ms %children 99.94% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.484ms self 0.484ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.876ms self 0.876ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.564ms self 1.564ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 3.662ms self 3.662ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 14.878ms self 14.878ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.401ms self 0.401ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 14.853ms self 14.853ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 1.430ms self 1.430ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.216ms self 1.216ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.224ms self 1.224ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.942ms self 1.942ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.929ms self 3.929ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 14.950ms self 14.950ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 1.130ms self 1.130ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.883ms self 0.883ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.216ms self 1.216ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.924ms self 1.924ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.967ms self 3.967ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 35.480ms self 35.480ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 5.108ms self 5.108ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.147ms self 0.147ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 26.206ms self 26.206ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR total 105.163ms self 0.004ms children 105.159ms %children 100.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.104ms self 0.104ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation total 104.894ms self 0.017ms children 104.877ms %children 99.98% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize total 104.877ms self 0.073ms children 104.805ms %children 99.93% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.486ms self 0.486ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 1.251ms self 1.251ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/NormalizeNames total 1.307ms self 1.307ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.986ms self 1.986ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 13.849ms self 13.849ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.399ms self 0.399ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 13.656ms self 13.656ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.889ms self 0.889ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.837ms self 0.837ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.174ms self 1.174ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 1.894ms self 1.894ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 23.275ms self 23.275ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.431ms self 0.431ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 5.152ms self 5.152ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/FoldConstants total 0.900ms self 0.900ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial 
IR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.835ms self 0.835ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/NormalizeNames total 1.167ms self 1.167ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/Simplify total 20.965ms self 20.965ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardLets total 3.708ms self 3.708ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.383ms self 0.383ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/LoweringTransformation/Optimize/PruneDeadFields total 10.263ms self 10.263ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.932 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, initial IR/Verify total 0.160ms self 0.160ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR total 0.476ms self 0.002ms children 0.474ms %children 99.60% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.100ms self 0.100ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/LoweringTransformation total 0.280ms self 0.280ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/InlineApplyIR/Verify total 0.094ms self 0.094ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR total 102.305ms self 0.003ms children 102.302ms %children 100.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation total 102.094ms self 0.016ms children 102.079ms %children 99.98% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize total 102.079ms self 0.062ms children 102.017ms %children 
99.94% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.489ms self 0.489ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.812ms self 0.812ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.181ms self 1.181ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 2.635ms self 2.635ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 32.842ms self 32.842ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 5.089ms self 5.089ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.835ms self 0.835ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.827ms self 0.827ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.153ms self 1.153ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 4.114ms self 4.114ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 30.057ms self 30.057ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.411ms self 0.411ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 4.894ms self 4.894ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/FoldConstants total 0.815ms self 0.815ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.816ms self 0.816ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/NormalizeNames total 1.155ms self 1.155ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/Simplify total 1.869ms self 1.869ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardLets total 3.160ms self 3.160ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/ForwardRelationalLets total 0.372ms self 0.372ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/LoweringTransformation/Optimize/PruneDeadFields total 8.096ms self 8.096ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after InlineApplyIR/Verify total 0.149ms self 0.149ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs total 26.232ms self 0.003ms children 26.229ms %children 99.99% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.112ms self 0.112ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/LoweringTransformation total 25.991ms self 25.991ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/LowerArrayAggsToRunAggs/Verify total 0.127ms self 0.127ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing 
SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs total 78.243ms self 0.003ms children 78.240ms %children 100.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.059ms self 0.059ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation total 78.036ms self 0.014ms children 78.021ms %children 99.98% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize total 78.021ms self 0.061ms children 77.960ms %children 99.92% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.435ms self 0.435ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.870ms self 0.870ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.267ms self 1.267ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 3.369ms self 3.369ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 7.000ms self 7.000ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.386ms self 0.386ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 7.608ms self 7.608ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.879ms self 0.879ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.852ms self 0.852ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 1.218ms self 1.218ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.904ms self 1.904ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 30.288ms self 30.288ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.401ms self 0.401ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 5.567ms self 5.567ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/FoldConstants total 0.878ms self 0.878ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.857ms self 0.857ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/NormalizeNames total 2.022ms self 2.022ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/Simplify total 1.921ms self 1.921ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardLets total 3.326ms self 3.326ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/ForwardRelationalLets total 0.394ms self 0.394ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after 
LowerArrayAggsToRunAggs/LoweringTransformation/Optimize/PruneDeadFields total 6.518ms self 6.518ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/optimize: compileLowerer, after LowerArrayAggsToRunAggs/Verify total 0.145ms self 0.145ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/Compile/EmitContext.analyze total 19.260ms self 19.260ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/InitializeCompiledFunction total 4.817ms self 4.817ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/LoweringTransformation/RunCompiledVoidFunction total 4m31.8s self 4m31.8s children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/LowerOrInterpretNonCompilable/Verify total 0.015ms self 0.015ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable total 0.341ms self 0.002ms children 0.339ms %children 99.35% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation total 0.333ms self 0.013ms children 0.319ms %children 96.01% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize total 0.319ms self 0.018ms children 0.302ms %children 94.41% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/FoldConstants total 0.016ms self 0.016ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.933 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ExtractIntervalFilters total 0.034ms self 0.034ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/NormalizeNames total 0.030ms self 0.030ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/Simplify total 0.055ms self 0.055ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardLets total 0.083ms self 0.083ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/ForwardRelationalLets total 0.011ms self 0.011ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after 
LowerOrInterpretNonCompilable/LoweringTransformation/Optimize/PruneDeadFields total 0.072ms self 0.072ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/optimize: relationalLowerer, after LowerOrInterpretNonCompilable/Verify total 0.003ms self 0.003ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/Compile total 0.086ms self 0.086ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/InitializeCompiledFunction total 1.280ms self 1.280ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.934 : INFO: timing SparkBackend.executeEncode/RunCompiledVoidFunction total 0.001ms self 0.001ms children 0.000ms %children 0.00% 2023-04-22 21:18:55.950 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:18:56.013 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:18:56.013 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:18:56.013 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.013 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.014 : INFO: timing SparkBackend.parse_blockmatrix_ir total 64.105ms self 64.105ms children 0.000ms %children 0.00% 2023-04-22 21:18:56.025 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:18:56.182 : INFO: JSON: JObject(List((name,JString(PCRelate)), (maf,JDouble(0.05)), (blockSize,JInt(4096)), (minKinship,JNull), (statistics,JInt(0)))) 2023-04-22 21:18:56.229 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:18:56.229 : INFO: TaskReport: stage=0, partition=0, attempt=0, peakBytes=0, peakBytesReadable=0.00 B, chunks requested=0, cache hits=0 2023-04-22 21:18:56.229 : INFO: RegionPool: FREE: 0 allocated (0 blocks / 0 chunks), regions.size = 0, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.229 : INFO: RegionPool: FREE: 64.0K allocated (64.0K blocks / 0 chunks), regions.size = 1, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.229 : INFO: timing SparkBackend.parse_value_ir total 204.779ms self 204.779ms children 0.000ms %children 0.00% 2023-04-22 21:18:56.239 : INFO: RegionPool: initialized for thread 14: Thread-5 2023-04-22 21:18:56.239 : INFO: starting execution of query hail_query_8 of initial size 38 2023-04-22 21:18:56.244 : INFO: initial IR: IR size 38: (Let __rng_state (RNGStateLiteral) (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __uid_8 (ToStream False (GetField rows (TableCollect 
(MatrixColsTable (MatrixMapCols None (MatrixMapCols () (MatrixMapCols None (MatrixAnnotateColsTable "__uid_4" (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String,fam_id:String,pat_id:String,mat_id:String,is_female:Boolean,is_case:Boolean},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String],rsid:String,cm_position:Float64},entry:Struct{GT:Call}} False False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],loadings:Array[Float64]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (MakeStruct)))))) (InsertFields (SelectFields (s) (SelectFields (s fam_id pat_id mat_id is_female is_case) (Ref sa))) None (__scores (GetField scores (GetField __uid_4 (Ref sa)))))) (SelectFields (s __scores) (Ref sa))) (InsertFields (SelectFields (__scores) (SelectFields (s __scores) (Ref sa))) None)))))) (GetField __scores (Ref __uid_8))))))) 2023-04-22 21:18:56.298 : INFO: after optimize: relationalLowerer, initial IR: IR size 27: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4397 (ToStream False (GetField rows (TableCollect (MatrixColsTable (MatrixMapCols () (MatrixAnnotateColsTable "__uid_4" (MatrixRead Matrix{global:Struct{},col_key:[s],col:Struct{s:String},row_key:[[locus,alleles]],row:Struct{locus:Locus(GRCh38),alleles:Array[String]},entry:Struct{}} False False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (Literal Struct{} )))))) (InsertFields (SelectFields () (Ref sa)) None (__scores (GetField scores (GetField __uid_4 (Ref sa)))))))))) (GetField __scores (Ref __iruid_4397)))))) 2023-04-22 21:18:56.302 : INFO: after LowerMatrixToTable: IR size 82: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4397 (ToStream False (GetField rows (TableCollect (TableKeyBy () False (TableParallelize None (Let __cols_and_globals (TableGetGlobals (TableMapGlobals (TableMapGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String],`the entries! 
[877f12a8827e18f61222c6c8c5fb04a8]`:Array[Struct{}]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"}) (Let __dictfield (ToDict (ToStream False (ToArray (StreamMap __iruid_4399 (ToStream False (GetField rows (TableCollect (TableKeyBy () False (TableKeyBy (s) False (TableParallelize None (MakeStruct (rows (GetField scores (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (global (Literal Struct{} ))))))))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4399)) (SelectFields (scores) (Ref __iruid_4399))))))) (InsertFields (Ref global) None (__cols (ToArray (StreamMap __iruid_4398 (ToStream False (GetField __cols (Ref global))) (InsertFields (Ref __iruid_4398) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __dictfield) (MakeStruct (s (GetField s (Ref __iruid_4398))))))))))))) (InsertFields (Ref global) None (__cols (ToArray (StreamMap __iruid_4403 (ToStream False (ToArray (StreamRange -1 False (I32 0) (ArrayLen (GetField __cols (Ref global))) (I32 1)))) (Let __cols_array (GetField __cols (Ref global)) (Let sa (ArrayRef -1 (Ref __cols_array) (Ref __iruid_4403)) (InsertFields (SelectFields () (Ref sa)) None (__scores (GetField scores (GetField __uid_4 (Ref sa))))))))))))) (MakeStruct (rows (GetField __cols (Ref __cols_and_globals))) (global (SelectFields () (Ref __cols_and_globals)))))))))) (GetField __scores (Ref __iruid_4397)))))) 2023-04-22 21:18:56.331 : INFO: Prune: MakeStruct: eliminating field 'global' 2023-04-22 21:18:56.370 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 53: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4474 (ToStream False (GetField rows (Let __iruid_4475 (ToDict (StreamMap __iruid_4476 (ToStream False (GetField scores (TableGetGlobals (TableRead 
Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4476)) (SelectFields (scores) (Ref __iruid_4476))))) (Let __iruid_4477 (ToArray (StreamMap __iruid_4478 (ToStream False (GetField __cols (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})))) (InsertFields (Ref __iruid_4478) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4475) (MakeStruct (s (GetField s (Ref __iruid_4478))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4479 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4477)) (I32 1)) (Let __iruid_4480 (ArrayRef -1 (Ref __iruid_4477) (Ref __iruid_4479)) (InsertFields (SelectFields () (Ref __iruid_4480)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4480)))))))))))))) (GetField __scores (Ref __iruid_4474)))))) 2023-04-22 21:18:56.372 : INFO: after LiftRelationalValuesToRelationalLets: IR size 57: (RelationalLet __iruid_4490 (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) (RelationalLet __iruid_4491 (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) (TableWrite 
"{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4474 (ToStream False (GetField rows (Let __iruid_4475 (ToDict (StreamMap __iruid_4476 (ToStream False (GetField scores (RelationalRef __iruid_4490 Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4476)) (SelectFields (scores) (Ref __iruid_4476))))) (Let __iruid_4477 (ToArray (StreamMap __iruid_4478 (ToStream False (GetField __cols (RelationalRef __iruid_4491 Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4478) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4475) (MakeStruct (s (GetField s (Ref __iruid_4478))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4479 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4477)) (I32 1)) (Let __iruid_4480 (ArrayRef -1 (Ref __iruid_4477) (Ref __iruid_4479)) (InsertFields (SelectFields () (Ref __iruid_4480)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4480)))))))))))))) (GetField __scores (Ref __iruid_4474)))))))) 2023-04-22 21:18:56.373 : INFO: initial IR: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:18:56.373 : INFO: after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:18:56.374 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False (TableNativeReader /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_table20F9rYCfO1 ))) 2023-04-22 21:18:56.374 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
2023-04-22 21:18:56.374 : INFO: lowering result: TableGetGlobals 2023-04-22 21:18:56.375 : INFO: compiling and evaluating result: TableGetGlobals 2023-04-22 21:18:56.395 : INFO: initial IR: IR size 9: (Let __iruid_4492 (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (MakeStruct (partitionIndex (I64 0)) (partitionPath (Str "/fg/saxena..."))))) (I32 0)) (Ref __iruid_4492)) 2023-04-22 21:18:56.397 : INFO: after optimize: relationalLowerer, initial IR: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.398 : INFO: after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.399 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.399 : INFO: after LiftRelationalValuesToRelationalLets: IR size 5: 
(ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.400 : INFO: after EvalRelationalLets: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.400 : INFO: after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.401 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.402 : INFO: after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.403 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 5: (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0)) 2023-04-22 21:18:56.404 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.405 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.405 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.407 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.408 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.432 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.439 : INFO: encoder cache hit 2023-04-22 21:18:56.439 MemoryStore: INFO: Block broadcast_196 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:18:56.452 MemoryStore: INFO: Block broadcast_196_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:18:56.453 BlockManagerInfo: INFO: Added 
broadcast_196_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:18:56.453 SparkContext: INFO: Created broadcast 196 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:56.454 : INFO: instruction count: 3: __C3850HailClassLoaderContainer. 2023-04-22 21:18:56.454 : INFO: instruction count: 3: __C3850HailClassLoaderContainer. 2023-04-22 21:18:56.454 : INFO: instruction count: 3: __C3852FSContainer. 2023-04-22 21:18:56.454 : INFO: instruction count: 3: __C3852FSContainer. 2023-04-22 21:18:56.457 : INFO: instruction count: 3: __C3854Compiled. 2023-04-22 21:18:56.458 : INFO: instruction count: 45: __C3854Compiled.apply 2023-04-22 21:18:56.458 : INFO: instruction count: 475: __C3854Compiled.__m3856split_ToArray 2023-04-22 21:18:56.458 : INFO: instruction count: 31: __C3854Compiled.__m3864DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.458 : INFO: instruction count: 22: __C3854Compiled.__m3865SKIP_r_array_of_r_float64 2023-04-22 21:18:56.458 : INFO: instruction count: 3: __C3854Compiled.__m3866SKIP_r_float64 2023-04-22 21:18:56.459 : INFO: instruction count: 58: __C3854Compiled.__m3867INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.459 : INFO: instruction count: 26: __C3854Compiled.__m3868INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.459 : INFO: instruction count: 31: __C3854Compiled.__m3869INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:56.459 : INFO: instruction count: 58: __C3854Compiled.__m3870INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:56.459 : INFO: instruction count: 10: __C3854Compiled.__m3871INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:56.459 : INFO: instruction count: 12: __C3854Compiled.__m3874setup_jab 2023-04-22 21:18:56.459 : INFO: instruction count: 35: __C3854Compiled.__m3879arrayref_bounds_check 2023-04-22 21:18:56.459 : INFO: instruction count: 9: __C3854Compiled.setPartitionIndex 2023-04-22 21:18:56.459 : INFO: instruction count: 4: __C3854Compiled.addPartitionRegion 2023-04-22 21:18:56.459 : INFO: instruction count: 4: __C3854Compiled.setPool 2023-04-22 21:18:56.459 : INFO: instruction count: 3: __C3854Compiled.addHailClassLoader 2023-04-22 21:18:56.459 : INFO: instruction count: 3: __C3854Compiled.addFS 2023-04-22 21:18:56.459 : INFO: instruction count: 4: __C3854Compiled.addTaskContext 2023-04-22 21:18:56.459 : INFO: instruction count: 41: __C3854Compiled.addAndDecodeLiterals 2023-04-22 21:18:56.459 : INFO: instruction count: 27: __C3854Compiled.__m3884DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.459 : INFO: instruction count: 26: __C3854Compiled.__m3885INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:18:56.459 : INFO: instruction count: 10: __C3854Compiled.__m3886INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:18:56.460 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.461 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.462 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.463 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.464 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.465 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.485 : INFO: encoder cache hit 2023-04-22 21:18:56.485 MemoryStore: INFO: Block broadcast_197 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:18:56.488 MemoryStore: INFO: Block broadcast_197_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:18:56.488 BlockManagerInfo: INFO: Added broadcast_197_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:18:56.489 SparkContext: INFO: Created broadcast 197 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:56.489 : INFO: instruction count: 3: __C3887HailClassLoaderContainer. 2023-04-22 21:18:56.489 : INFO: instruction count: 3: __C3887HailClassLoaderContainer. 2023-04-22 21:18:56.489 : INFO: instruction count: 3: __C3889FSContainer. 2023-04-22 21:18:56.489 : INFO: instruction count: 3: __C3889FSContainer. 2023-04-22 21:18:56.493 : INFO: instruction count: 3: __C3891Compiled. 
2023-04-22 21:18:56.493 : INFO: instruction count: 45: __C3891Compiled.apply 2023-04-22 21:18:56.493 : INFO: instruction count: 475: __C3891Compiled.__m3893split_ToArray 2023-04-22 21:18:56.494 : INFO: instruction count: 31: __C3891Compiled.__m3901DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.494 : INFO: instruction count: 22: __C3891Compiled.__m3902SKIP_r_array_of_r_float64 2023-04-22 21:18:56.494 : INFO: instruction count: 3: __C3891Compiled.__m3903SKIP_r_float64 2023-04-22 21:18:56.494 : INFO: instruction count: 58: __C3891Compiled.__m3904INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.494 : INFO: instruction count: 26: __C3891Compiled.__m3905INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.494 : INFO: instruction count: 31: __C3891Compiled.__m3906INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:56.494 : INFO: instruction count: 58: __C3891Compiled.__m3907INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:56.494 : INFO: instruction count: 10: __C3891Compiled.__m3908INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:56.494 : INFO: instruction count: 12: __C3891Compiled.__m3911setup_jab 2023-04-22 21:18:56.494 : INFO: instruction count: 35: __C3891Compiled.__m3916arrayref_bounds_check 2023-04-22 21:18:56.494 : INFO: instruction count: 9: __C3891Compiled.setPartitionIndex 2023-04-22 21:18:56.494 : INFO: instruction count: 4: __C3891Compiled.addPartitionRegion 2023-04-22 21:18:56.494 : INFO: instruction count: 4: __C3891Compiled.setPool 2023-04-22 21:18:56.494 : INFO: instruction count: 3: __C3891Compiled.addHailClassLoader 2023-04-22 21:18:56.494 : INFO: instruction count: 3: __C3891Compiled.addFS 2023-04-22 21:18:56.494 : INFO: instruction count: 4: __C3891Compiled.addTaskContext 2023-04-22 21:18:56.494 : INFO: instruction count: 41: __C3891Compiled.addAndDecodeLiterals 2023-04-22 21:18:56.494 : INFO: instruction count: 27: __C3891Compiled.__m3921DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.494 : INFO: instruction count: 26: __C3891Compiled.__m3922INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:18:56.494 : INFO: instruction count: 10: __C3891Compiled.__m3923INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:18:56.495 : INFO: initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.496 : INFO: after optimize: compileLowerer, initial IR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition 
Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.497 : INFO: after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.498 : INFO: after optimize: compileLowerer, after InlineApplyIR: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.500 : INFO: after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} "{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.501 : INFO: after optimize: compileLowerer, after LowerArrayAggsToRunAggs: IR size 6: (MakeTuple (0) (ArrayRef -1 (ToArray (ReadPartition Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]} 
"{\"name\":\"PartitionNativeReader\",\"spec\":{\"name\":\"TypedCodecSpec\",\"_eType\":\"+EBaseStruct{eigenvalues:+EArray[+EFloat64],scores:+EArray[+EBaseStruct{s:+EBinary,scores:+EArray[+EFloat64]}]}\",\"_vType\":\"Struct{eigenvalues:Array[Float64],scores:Array[Struct{s:String,scores:Array[Float64]}]}\",\"_bufferSpec\":{\"name\":\"LEB128BufferSpec\",\"child\":{\"name\":\"BlockingBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"LZ4FastBlockBufferSpec\",\"blockSize\":32768,\"child\":{\"name\":\"StreamBlockBufferSpec\"}}}}},\"uidFieldName\":\"__row_uid\"}" (Literal Struct{partitionIndex:Int64,partitionPath:String} ))) (I32 0))) 2023-04-22 21:18:56.517 : INFO: encoder cache hit 2023-04-22 21:18:56.517 MemoryStore: INFO: Block broadcast_198 stored as values in memory (estimated size 216.0 B, free 25.1 GiB) 2023-04-22 21:18:56.518 MemoryStore: INFO: Block broadcast_198_piece0 stored as bytes in memory (estimated size 162.0 B, free 25.1 GiB) 2023-04-22 21:18:56.518 BlockManagerInfo: INFO: Added broadcast_198_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 162.0 B, free: 25.3 GiB) 2023-04-22 21:18:56.519 SparkContext: INFO: Created broadcast 198 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:56.519 : INFO: instruction count: 3: __C3924HailClassLoaderContainer. 2023-04-22 21:18:56.519 : INFO: instruction count: 3: __C3924HailClassLoaderContainer. 2023-04-22 21:18:56.519 : INFO: instruction count: 3: __C3926FSContainer. 2023-04-22 21:18:56.519 : INFO: instruction count: 3: __C3926FSContainer. 2023-04-22 21:18:56.522 : INFO: instruction count: 3: __C3928Compiled. 2023-04-22 21:18:56.523 : INFO: instruction count: 45: __C3928Compiled.apply 2023-04-22 21:18:56.523 : INFO: instruction count: 475: __C3928Compiled.__m3930split_ToArray 2023-04-22 21:18:56.523 : INFO: instruction count: 31: __C3928Compiled.__m3938DECODE_r_struct_of_r_array_of_r_float64ANDr_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.523 : INFO: instruction count: 22: __C3928Compiled.__m3939SKIP_r_array_of_r_float64 2023-04-22 21:18:56.523 : INFO: instruction count: 3: __C3928Compiled.__m3940SKIP_r_float64 2023-04-22 21:18:56.523 : INFO: instruction count: 58: __C3928Compiled.__m3941INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.523 : INFO: instruction count: 26: __C3928Compiled.__m3942INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:56.523 : INFO: instruction count: 31: __C3928Compiled.__m3943INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:56.523 : INFO: instruction count: 58: __C3928Compiled.__m3944INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:56.523 : INFO: instruction count: 10: __C3928Compiled.__m3945INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:56.523 : INFO: instruction count: 12: __C3928Compiled.__m3948setup_jab 2023-04-22 21:18:56.523 : INFO: instruction count: 35: __C3928Compiled.__m3953arrayref_bounds_check 2023-04-22 21:18:56.523 : INFO: instruction count: 9: __C3928Compiled.setPartitionIndex 2023-04-22 21:18:56.523 : INFO: instruction count: 4: __C3928Compiled.addPartitionRegion 2023-04-22 21:18:56.523 : INFO: instruction count: 4: __C3928Compiled.setPool 2023-04-22 21:18:56.524 : INFO: instruction count: 3: __C3928Compiled.addHailClassLoader 2023-04-22 21:18:56.524 : INFO: instruction count: 3: 
__C3928Compiled.addFS 2023-04-22 21:18:56.524 : INFO: instruction count: 4: __C3928Compiled.addTaskContext 2023-04-22 21:18:56.524 : INFO: instruction count: 41: __C3928Compiled.addAndDecodeLiterals 2023-04-22 21:18:56.524 : INFO: instruction count: 27: __C3928Compiled.__m3958DECODE_r_struct_of_r_struct_of_r_int64ANDr_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:18:56.524 : INFO: instruction count: 26: __C3928Compiled.__m3959INPLACE_DECODE_r_struct_of_r_int64ANDr_binaryEND_TO_r_struct_of_r_int64ANDr_stringEND 2023-04-22 21:18:56.524 : INFO: instruction count: 10: __C3928Compiled.__m3960INPLACE_DECODE_r_int64_TO_r_int64 2023-04-22 21:18:56.604 : INFO: RegionPool: REPORT_THRESHOLD: 256.0K allocated (128.0K blocks / 128.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.610 : INFO: RegionPool: REPORT_THRESHOLD: 512.0K allocated (320.0K blocks / 192.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.617 : INFO: RegionPool: REPORT_THRESHOLD: 1.0M allocated (704.0K blocks / 320.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:56.618 : INFO: encoder cache hit 2023-04-22 21:18:56.620 : INFO: took 246.653ms 2023-04-22 21:18:56.620 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: initial IR: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after EvalRelationalLets: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.621 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}) 2023-04-22 21:18:56.648 : INFO: encoder cache hit 2023-04-22 21:18:56.650 : INFO: initial IR: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False 
{\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:18:56.651 : INFO: after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:18:56.652 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 2: (TableGetGlobals (TableRead Table{global:Struct{__cols:Array[Struct{s:String}]},key:[locus,alleles],row:Struct{locus:Locus(GRCh38),alleles:Array[String]}} False {\"bed\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bed\",\"bim\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.bim\",\"fam\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/HGDP_1KG_LD.fam\",\"delimiter\":\"\\\\\\\\s+\",\"missing\":\"NA\",\"quantPheno\":false,\"a2Reference\":true,\"rg\":\"GRCh38\",\"contigRecoding\":{\"12\":\"chr12\",\"8\":\"chr8\",\"19\":\"chr19\",\"23\":\"chrX\",\"4\":\"chr4\",\"15\":\"chr15\",\"11\":\"chr11\",\"9\":\"chr9\",\"22\":\"chr22\",\"26\":\"chrM\",\"13\":\"chr13\",\"24\":\"chrY\",\"16\":\"chr16\",\"5\":\"chr5\",\"10\":\"chr10\",\"21\":\"chr21\",\"6\":\"chr6\",\"1\":\"chr1\",\"17\":\"chr17\",\"25\":\"chrX\",\"14\":\"chr14\",\"20\":\"chr20\",\"2\":\"chr2\",\"18\":\"chr18\",\"7\":\"chr7\",\"3\":\"chr3\"},\"skipInvalidLoci\":false,\"name\":\"MatrixPLINKReader\"})) 2023-04-22 21:18:56.652 : INFO: LowerOrInterpretNonCompilable: whole stage code generation is a go! 
2023-04-22 21:18:56.652 : INFO: lowering result: TableGetGlobals 2023-04-22 21:18:56.654 MemoryStore: INFO: Block broadcast_199 stored as values in memory (estimated size 47.3 MiB, free 25.1 GiB) 2023-04-22 21:18:57.338 MemoryStore: INFO: Block broadcast_199_piece0 stored as bytes in memory (estimated size 2.3 MiB, free 25.1 GiB) 2023-04-22 21:18:57.338 BlockManagerInfo: INFO: Added broadcast_199_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 2.3 MiB, free: 25.3 GiB) 2023-04-22 21:18:57.339 SparkContext: INFO: Created broadcast 199 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:57.341 : INFO: compiling and evaluating result: TableGetGlobals 2023-04-22 21:18:57.341 : INFO: initial IR: IR size 3: (Let __iruid_4495 (Literal Struct{__cols:Array[Struct{s:String}]} ) (Ref __iruid_4495)) 2023-04-22 21:18:57.341 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.341 : INFO: after LowerMatrixToTable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.341 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after EvalRelationalLets: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after LowerAndExecuteShuffles: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.342 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (Literal Struct{__cols:Array[Struct{s:String}]} ) 2023-04-22 21:18:57.385 : INFO: encoder cache hit 2023-04-22 21:18:57.386 : INFO: took 733.937ms 2023-04-22 21:18:57.386 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: initial IR: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after optimize: relationalLowerer, initial IR: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after optimize: relationalLowerer, after LowerMatrixToTable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after LiftRelationalValuesToRelationalLets: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after EvalRelationalLets: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.386 : INFO: after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.387 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 1: (EncodedLiteral 
Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.387 : INFO: after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.387 : INFO: after optimize: relationalLowerer, after LowerOrInterpretNonCompilable: IR size 1: (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}) 2023-04-22 21:18:57.391 : INFO: encoder cache hit 2023-04-22 21:18:57.402 : INFO: after EvalRelationalLets: IR size 51: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4474 (ToStream False (GetField rows (Let __iruid_4475 (ToDict (StreamMap __iruid_4476 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4476)) (SelectFields (scores) (Ref __iruid_4476))))) (Let __iruid_4477 (ToArray (StreamMap __iruid_4478 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4478) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4475) (MakeStruct (s (GetField s (Ref __iruid_4478))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4479 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4477)) (I32 1)) (Let __iruid_4480 (ArrayRef -1 (Ref __iruid_4477) (Ref __iruid_4479)) (InsertFields (SelectFields () (Ref __iruid_4480)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4480)))))))))))))) (GetField __scores (Ref __iruid_4474)))))) 2023-04-22 21:18:57.404 : INFO: after LowerAndExecuteShuffles: IR size 51: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4474 (ToStream False (GetField rows (Let __iruid_4475 (ToDict (StreamMap __iruid_4476 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4476)) (SelectFields (scores) (Ref __iruid_4476))))) (Let __iruid_4477 (ToArray (StreamMap __iruid_4478 (ToStream False (GetField __cols (EncodedLiteral 
Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4478) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4475) (MakeStruct (s (GetField s (Ref __iruid_4478))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4479 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4477)) (I32 1)) (Let __iruid_4480 (ArrayRef -1 (Ref __iruid_4477) (Ref __iruid_4479)) (InsertFields (SelectFields () (Ref __iruid_4480)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4480)))))))))))))) (GetField __scores (Ref __iruid_4474)))))) 2023-04-22 21:18:57.442 : INFO: after optimize: relationalLowerer, after LowerAndExecuteShuffles: IR size 51: (TableWrite "{\"name\":\"TableNativeWriter\",\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/persist_tablemjEhQBgG4Y\",\"overwrite\":false,\"stageLocally\":false,\"codecSpecJSONStr\":\"{\\n \\\"name\\\": \\\"LEB128BufferSpec\\\",\\n \\\"child\\\": {\\n \\\"name\\\": \\\"BlockingBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"LZ4FastBlockBufferSpec\\\",\\n \\\"blockSize\\\": 32768,\\n \\\"child\\\": {\\n \\\"name\\\": \\\"StreamBlockBufferSpec\\\"\\n }\\n }\\n }\\n}\"}" (BlockMatrixToTableApply "{\"name\":\"PCRelate\",\"maf\":0.05,\"blockSize\":4096,\"statistics\":0}" (BlockMatrixRead "{\"path\":\"/fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/UwaCOvX5T5QjBkSyEGmuOX\",\"name\":\"BlockMatrixNativeReader\"}") (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))))) 2023-04-22 21:18:57.443 : INFO: LowerOrInterpretNonCompilable: cannot efficiently lower query: BlockMatrixToTableApply 2023-04-22 21:18:57.443 : INFO: interpreting non-compilable result: TableWrite 2023-04-22 21:18:57.519 : INFO: initial IR: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref 
__iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.519 : INFO: after LowerMatrixToTable: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.520 : INFO: after LiftRelationalValuesToRelationalLets: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.521 : INFO: after EvalRelationalLets: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.522 : INFO: after LowerAndExecuteShuffles: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False 
(GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.523 : INFO: after LowerOrInterpretNonCompilable: IR size 48: (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539)))) 2023-04-22 21:18:57.544 : INFO: initial IR: IR size 49: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.545 : INFO: after InlineApplyIR: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray 
(StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.548 : INFO: after LowerArrayAggsToRunAggs: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.585 : INFO: encoder cache hit 2023-04-22 21:18:57.586 MemoryStore: INFO: Block broadcast_200 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:18:57.590 MemoryStore: INFO: Block broadcast_200_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:18:57.590 BlockManagerInfo: INFO: Added broadcast_200_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:18:57.591 SparkContext: INFO: Created broadcast 200 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:57.591 : INFO: instruction count: 3: __C3961HailClassLoaderContainer. 
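The TableWrite above wraps a BlockMatrixToTableApply named PCRelate (maf 0.05, blockSize 4096, statistics 0, i.e. kinship only), applied to a block matrix read back from a temporary path, with the encoded eigenvalues/scores literal supplying per-sample PC scores. That shape matches Hail's hl.pc_relate driven by PCA scores. A hedged sketch under that assumption follows; only the PCRelate parameters are logged, so the PCA step, the number of PCs, and the reuse of the ld_pruned MatrixTable from the import_plink sketch above are assumptions.

import hail as hl

# ld_pruned: the MatrixTable from the hl.import_plink sketch above (assumed).
# The scores literal decoded in this log has the shape of hwe_normalized_pca output.
_, scores, _ = hl.hwe_normalized_pca(ld_pruned.GT, k=10, compute_loadings=False)  # k is assumed

rel = hl.pc_relate(
    ld_pruned.GT,
    min_individual_maf=0.05,                       # "maf": 0.05 in the logged config
    scores_expr=scores[ld_pruned.col_key].scores,  # per-sample PC scores
    block_size=4096,                               # "blockSize": 4096
    statistics="kin",                              # "statistics": 0 -> kinship only
)
rel = rel.persist()  # plausibly the source of the persist_table* TableNativeWriter path above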
2023-04-22 21:18:57.591 : INFO: instruction count: 3: __C3961HailClassLoaderContainer. 2023-04-22 21:18:57.591 : INFO: instruction count: 3: __C3963FSContainer. 2023-04-22 21:18:57.591 : INFO: instruction count: 3: __C3963FSContainer. 2023-04-22 21:18:57.618 : INFO: instruction count: 3: __C3965Compiled. 2023-04-22 21:18:57.618 : INFO: instruction count: 31: __C3965Compiled.apply 2023-04-22 21:18:57.619 : INFO: instruction count: 259: __C3965Compiled.__m3967split_ToArray 2023-04-22 21:18:57.619 : INFO: instruction count: 377: __C3965Compiled.__m3969split_ToDict 2023-04-22 21:18:57.619 : INFO: instruction count: 12: __C3965Compiled.__m3979setup_jab 2023-04-22 21:18:57.619 : INFO: instruction count: 202: __C3965Compiled.__m3982arraySorter_outer 2023-04-22 21:18:57.619 : INFO: instruction count: 93: __C3965Compiled.__m3983arraySorter_merge 2023-04-22 21:18:57.619 : INFO: instruction count: 11: __C3965Compiled.__m3986ord_lt 2023-04-22 21:18:57.619 : INFO: instruction count: 52: __C3965Compiled.__m3987ord_ltNonnull 2023-04-22 21:18:57.619 : INFO: instruction count: 11: __C3965Compiled.__m3988ord_lt 2023-04-22 21:18:57.619 : INFO: instruction count: 21: __C3965Compiled.__m3989ord_ltNonnull 2023-04-22 21:18:57.619 : INFO: instruction count: 9: __C3965Compiled.__m3990ord_compareNonnull 2023-04-22 21:18:57.620 : INFO: instruction count: 89: __C3965Compiled.__m3991ord_compareNonnull 2023-04-22 21:18:57.620 : INFO: instruction count: 11: __C3965Compiled.__m3992ord_equiv 2023-04-22 21:18:57.620 : INFO: instruction count: 21: __C3965Compiled.__m3993ord_equivNonnull 2023-04-22 21:18:57.620 : INFO: instruction count: 36: __C3965Compiled.__m3994arraySorter_splitMerge 2023-04-22 21:18:57.620 : INFO: instruction count: 264: __C3965Compiled.__m3995distinctFromSorted 2023-04-22 21:18:57.620 : INFO: instruction count: 30: __C3965Compiled.__m3998ord_equiv 2023-04-22 21:18:57.620 : INFO: instruction count: 39: __C3965Compiled.__m3999ord_equivNonnull 2023-04-22 21:18:57.620 : INFO: instruction count: 223: __C3965Compiled.__m4007split_ToArray 2023-04-22 21:18:57.620 : INFO: instruction count: 146: __C3965Compiled.__m4018split_Let 2023-04-22 21:18:57.620 : INFO: instruction count: 60: __C3965Compiled.__m4021findElt 2023-04-22 21:18:57.620 : INFO: instruction count: 11: __C3965Compiled.__m4022ord_lt 2023-04-22 21:18:57.620 : INFO: instruction count: 44: __C3965Compiled.__m4023ord_ltNonnull 2023-04-22 21:18:57.621 : INFO: instruction count: 11: __C3965Compiled.__m4025ord_equiv 2023-04-22 21:18:57.621 : INFO: instruction count: 14: __C3965Compiled.__m4026ord_equivNonnull 2023-04-22 21:18:57.621 : INFO: instruction count: 35: __C3965Compiled.__m4027arrayref_bounds_check 2023-04-22 21:18:57.621 : INFO: instruction count: 11: __C3965Compiled.__m4028ord_equiv 2023-04-22 21:18:57.621 : INFO: instruction count: 31: __C3965Compiled.__m4029ord_equivNonnull 2023-04-22 21:18:57.621 : INFO: instruction count: 255: __C3965Compiled.__m4038split_ToArray 2023-04-22 21:18:57.621 : INFO: instruction count: 9: __C3965Compiled.setPartitionIndex 2023-04-22 21:18:57.621 : INFO: instruction count: 4: __C3965Compiled.addPartitionRegion 2023-04-22 21:18:57.621 : INFO: instruction count: 4: __C3965Compiled.setPool 2023-04-22 21:18:57.621 : INFO: instruction count: 3: __C3965Compiled.addHailClassLoader 2023-04-22 21:18:57.621 : INFO: instruction count: 3: __C3965Compiled.addFS 2023-04-22 21:18:57.621 : INFO: instruction count: 4: __C3965Compiled.addTaskContext 2023-04-22 21:18:57.621 : INFO: instruction count: 96: 
__C3965Compiled.addAndDecodeLiterals 2023-04-22 21:18:57.621 : INFO: instruction count: 18: __C3965Compiled.__m4062DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:18:57.621 : INFO: instruction count: 27: __C3965Compiled.__m4063DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 21:18:57.621 : INFO: instruction count: 58: __C3965Compiled.__m4064INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:18:57.621 : INFO: instruction count: 17: __C3965Compiled.__m4065INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:18:57.621 : INFO: instruction count: 31: __C3965Compiled.__m4066INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:57.621 : INFO: instruction count: 27: __C3965Compiled.__m4067DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:57.621 : INFO: instruction count: 58: __C3965Compiled.__m4068INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.621 : INFO: instruction count: 26: __C3965Compiled.__m4069INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.622 : INFO: instruction count: 58: __C3965Compiled.__m4070INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:57.622 : INFO: instruction count: 10: __C3965Compiled.__m4071INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:57.622 : INFO: initial IR: IR size 49: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.623 : INFO: after InlineApplyIR: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref 
__iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.637 : INFO: after LowerArrayAggsToRunAggs: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.685 : INFO: encoder cache hit 2023-04-22 21:18:57.685 MemoryStore: INFO: Block broadcast_201 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:18:57.688 MemoryStore: INFO: Block broadcast_201_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:18:57.689 BlockManagerInfo: INFO: Added broadcast_201_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:18:57.702 SparkContext: INFO: Created broadcast 201 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:57.702 : INFO: instruction count: 3: __C4072HailClassLoaderContainer. 2023-04-22 21:18:57.702 : INFO: instruction count: 3: __C4072HailClassLoaderContainer. 2023-04-22 21:18:57.702 : INFO: instruction count: 3: __C4074FSContainer. 2023-04-22 21:18:57.702 : INFO: instruction count: 3: __C4074FSContainer. 2023-04-22 21:18:57.715 : INFO: instruction count: 3: __C4076Compiled. 
2023-04-22 21:18:57.716 : INFO: instruction count: 31: __C4076Compiled.apply 2023-04-22 21:18:57.716 : INFO: instruction count: 259: __C4076Compiled.__m4078split_ToArray 2023-04-22 21:18:57.716 : INFO: instruction count: 377: __C4076Compiled.__m4080split_ToDict 2023-04-22 21:18:57.716 : INFO: instruction count: 12: __C4076Compiled.__m4090setup_jab 2023-04-22 21:18:57.716 : INFO: instruction count: 202: __C4076Compiled.__m4093arraySorter_outer 2023-04-22 21:18:57.716 : INFO: instruction count: 93: __C4076Compiled.__m4094arraySorter_merge 2023-04-22 21:18:57.716 : INFO: instruction count: 11: __C4076Compiled.__m4097ord_lt 2023-04-22 21:18:57.717 : INFO: instruction count: 52: __C4076Compiled.__m4098ord_ltNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 11: __C4076Compiled.__m4099ord_lt 2023-04-22 21:18:57.717 : INFO: instruction count: 21: __C4076Compiled.__m4100ord_ltNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 9: __C4076Compiled.__m4101ord_compareNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 89: __C4076Compiled.__m4102ord_compareNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 11: __C4076Compiled.__m4103ord_equiv 2023-04-22 21:18:57.717 : INFO: instruction count: 21: __C4076Compiled.__m4104ord_equivNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 36: __C4076Compiled.__m4105arraySorter_splitMerge 2023-04-22 21:18:57.717 : INFO: instruction count: 264: __C4076Compiled.__m4106distinctFromSorted 2023-04-22 21:18:57.717 : INFO: instruction count: 30: __C4076Compiled.__m4109ord_equiv 2023-04-22 21:18:57.717 : INFO: instruction count: 39: __C4076Compiled.__m4110ord_equivNonnull 2023-04-22 21:18:57.717 : INFO: instruction count: 223: __C4076Compiled.__m4118split_ToArray 2023-04-22 21:18:57.718 : INFO: instruction count: 146: __C4076Compiled.__m4129split_Let 2023-04-22 21:18:57.718 : INFO: instruction count: 60: __C4076Compiled.__m4132findElt 2023-04-22 21:18:57.718 : INFO: instruction count: 11: __C4076Compiled.__m4133ord_lt 2023-04-22 21:18:57.718 : INFO: instruction count: 44: __C4076Compiled.__m4134ord_ltNonnull 2023-04-22 21:18:57.718 : INFO: instruction count: 11: __C4076Compiled.__m4136ord_equiv 2023-04-22 21:18:57.718 : INFO: instruction count: 14: __C4076Compiled.__m4137ord_equivNonnull 2023-04-22 21:18:57.718 : INFO: instruction count: 35: __C4076Compiled.__m4138arrayref_bounds_check 2023-04-22 21:18:57.718 : INFO: instruction count: 11: __C4076Compiled.__m4139ord_equiv 2023-04-22 21:18:57.718 : INFO: instruction count: 31: __C4076Compiled.__m4140ord_equivNonnull 2023-04-22 21:18:57.718 : INFO: instruction count: 255: __C4076Compiled.__m4149split_ToArray 2023-04-22 21:18:57.718 : INFO: instruction count: 9: __C4076Compiled.setPartitionIndex 2023-04-22 21:18:57.718 : INFO: instruction count: 4: __C4076Compiled.addPartitionRegion 2023-04-22 21:18:57.718 : INFO: instruction count: 4: __C4076Compiled.setPool 2023-04-22 21:18:57.718 : INFO: instruction count: 3: __C4076Compiled.addHailClassLoader 2023-04-22 21:18:57.718 : INFO: instruction count: 3: __C4076Compiled.addFS 2023-04-22 21:18:57.718 : INFO: instruction count: 4: __C4076Compiled.addTaskContext 2023-04-22 21:18:57.718 : INFO: instruction count: 96: __C4076Compiled.addAndDecodeLiterals 2023-04-22 21:18:57.718 : INFO: instruction count: 18: __C4076Compiled.__m4173DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:18:57.718 : INFO: instruction count: 27: __C4076Compiled.__m4174DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 
21:18:57.718 : INFO: instruction count: 58: __C4076Compiled.__m4175INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:18:57.718 : INFO: instruction count: 17: __C4076Compiled.__m4176INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:18:57.718 : INFO: instruction count: 31: __C4076Compiled.__m4177INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:57.719 : INFO: instruction count: 27: __C4076Compiled.__m4178DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:57.719 : INFO: instruction count: 58: __C4076Compiled.__m4179INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.719 : INFO: instruction count: 26: __C4076Compiled.__m4180INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.719 : INFO: instruction count: 58: __C4076Compiled.__m4181INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:57.719 : INFO: instruction count: 10: __C4076Compiled.__m4182INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:57.720 : INFO: initial IR: IR size 49: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (ApplyIR -1 get () Struct{scores:Array[Float64]} (Ref __iruid_4540) (MakeStruct (s (GetField s (Ref __iruid_4543))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.730 : INFO: after InlineApplyIR: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA 
Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.733 : INFO: after LowerArrayAggsToRunAggs: IR size 79: (MakeTuple (0) (ToArray (StreamMap __iruid_4539 (ToStream False (GetField rows (Let __iruid_4540 (ToDict (StreamMap __iruid_4541 (ToStream False (GetField scores (EncodedLiteral Struct{scores:Array[Struct{s:String,scores:Array[Float64]}]}))) (MakeTuple (0 1) (SelectFields (s) (Ref __iruid_4541)) (SelectFields (scores) (Ref __iruid_4541))))) (Let __iruid_4542 (ToArray (StreamMap __iruid_4543 (ToStream False (GetField __cols (EncodedLiteral Struct{__cols:Array[Struct{s:String}]}))) (InsertFields (Ref __iruid_4543) None (__uid_4 (Let __iruid_4552 (Ref __iruid_4540) (Let __iruid_4553 (MakeStruct (s (GetField s (Ref __iruid_4543)))) (If (IsNA (Ref __iruid_4552)) (NA Struct{scores:Array[Float64]}) (Let __iruid_4554 (LowerBoundOnOrderedCollection True (Ref __iruid_4552) (Ref __iruid_4553)) (If (ApplyComparisonOp EQWithNA (Ref __iruid_4554) (ArrayLen (CastToArray (Ref __iruid_4552)))) (NA Struct{scores:Array[Float64]}) (If (ApplyComparisonOp EQWithNA (GetField key (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (Ref __iruid_4553)) (GetField value (ArrayRef -1 (CastToArray (Ref __iruid_4552)) (Ref __iruid_4554))) (NA Struct{scores:Array[Float64]}))))))))))) (MakeStruct (rows (ToArray (StreamMap __iruid_4544 (StreamRange -1 False (I32 0) (ArrayLen (Ref __iruid_4542)) (I32 1)) (Let __iruid_4545 (ArrayRef -1 (Ref __iruid_4542) (Ref __iruid_4544)) (InsertFields (SelectFields () (Ref __iruid_4545)) None (__scores (GetField scores (GetField __uid_4 (Ref __iruid_4545)))))))))))))) (GetField __scores (Ref __iruid_4539))))) 2023-04-22 21:18:57.784 : INFO: encoder cache hit 2023-04-22 21:18:57.784 MemoryStore: INFO: Block broadcast_202 stored as values in memory (estimated size 435.4 KiB, free 25.1 GiB) 2023-04-22 21:18:57.788 MemoryStore: INFO: Block broadcast_202_piece0 stored as bytes in memory (estimated size 368.6 KiB, free 25.1 GiB) 2023-04-22 21:18:57.789 BlockManagerInfo: INFO: Added broadcast_202_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 368.6 KiB, free: 25.3 GiB) 2023-04-22 21:18:57.790 SparkContext: INFO: Created broadcast 202 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:57.790 : INFO: instruction count: 3: __C4183HailClassLoaderContainer. 2023-04-22 21:18:57.790 : INFO: instruction count: 3: __C4183HailClassLoaderContainer. 2023-04-22 21:18:57.790 : INFO: instruction count: 3: __C4185FSContainer. 2023-04-22 21:18:57.790 : INFO: instruction count: 3: __C4185FSContainer. 2023-04-22 21:18:57.819 : INFO: instruction count: 3: __C4187Compiled. 
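In the inlined IR dumps above, the ApplyIR `get` on the scores dict has been expanded by InlineApplyIR into an IsNA check, a LowerBoundOnOrderedCollection, and two EQWithNA comparisons: a dict lookup lowered to a binary search over the dict's sorted key/value entries, returning missing when the key is absent. A small Python sketch of that lowered logic (illustrative only, not Hail code):

import bisect

def dict_get(sorted_entries, key):
    # sorted_entries: list of (key, value) pairs sorted by key; None models a missing dict.
    if sorted_entries is None:            # IsNA(dict) -> NA
        return None
    keys = [k for k, _ in sorted_entries]
    i = bisect.bisect_left(keys, key)     # LowerBoundOnOrderedCollection
    if i == len(sorted_entries):          # lower bound past the end -> NA
        return None
    k, v = sorted_entries[i]
    return v if k == key else None        # final key equality check -> value or NA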
2023-04-22 21:18:57.819 : INFO: instruction count: 31: __C4187Compiled.apply 2023-04-22 21:18:57.819 : INFO: instruction count: 259: __C4187Compiled.__m4189split_ToArray 2023-04-22 21:18:57.819 : INFO: instruction count: 377: __C4187Compiled.__m4191split_ToDict 2023-04-22 21:18:57.819 : INFO: instruction count: 12: __C4187Compiled.__m4201setup_jab 2023-04-22 21:18:57.820 : INFO: instruction count: 202: __C4187Compiled.__m4204arraySorter_outer 2023-04-22 21:18:57.820 : INFO: instruction count: 93: __C4187Compiled.__m4205arraySorter_merge 2023-04-22 21:18:57.820 : INFO: instruction count: 11: __C4187Compiled.__m4208ord_lt 2023-04-22 21:18:57.820 : INFO: instruction count: 52: __C4187Compiled.__m4209ord_ltNonnull 2023-04-22 21:18:57.820 : INFO: instruction count: 11: __C4187Compiled.__m4210ord_lt 2023-04-22 21:18:57.820 : INFO: instruction count: 21: __C4187Compiled.__m4211ord_ltNonnull 2023-04-22 21:18:57.820 : INFO: instruction count: 9: __C4187Compiled.__m4212ord_compareNonnull 2023-04-22 21:18:57.820 : INFO: instruction count: 89: __C4187Compiled.__m4213ord_compareNonnull 2023-04-22 21:18:57.820 : INFO: instruction count: 11: __C4187Compiled.__m4214ord_equiv 2023-04-22 21:18:57.820 : INFO: instruction count: 21: __C4187Compiled.__m4215ord_equivNonnull 2023-04-22 21:18:57.820 : INFO: instruction count: 36: __C4187Compiled.__m4216arraySorter_splitMerge 2023-04-22 21:18:57.820 : INFO: instruction count: 264: __C4187Compiled.__m4217distinctFromSorted 2023-04-22 21:18:57.821 : INFO: instruction count: 30: __C4187Compiled.__m4220ord_equiv 2023-04-22 21:18:57.821 : INFO: instruction count: 39: __C4187Compiled.__m4221ord_equivNonnull 2023-04-22 21:18:57.821 : INFO: instruction count: 223: __C4187Compiled.__m4229split_ToArray 2023-04-22 21:18:57.821 : INFO: instruction count: 146: __C4187Compiled.__m4240split_Let 2023-04-22 21:18:57.821 : INFO: instruction count: 60: __C4187Compiled.__m4243findElt 2023-04-22 21:18:57.821 : INFO: instruction count: 11: __C4187Compiled.__m4244ord_lt 2023-04-22 21:18:57.821 : INFO: instruction count: 44: __C4187Compiled.__m4245ord_ltNonnull 2023-04-22 21:18:57.821 : INFO: instruction count: 11: __C4187Compiled.__m4247ord_equiv 2023-04-22 21:18:57.821 : INFO: instruction count: 14: __C4187Compiled.__m4248ord_equivNonnull 2023-04-22 21:18:57.821 : INFO: instruction count: 35: __C4187Compiled.__m4249arrayref_bounds_check 2023-04-22 21:18:57.821 : INFO: instruction count: 11: __C4187Compiled.__m4250ord_equiv 2023-04-22 21:18:57.821 : INFO: instruction count: 31: __C4187Compiled.__m4251ord_equivNonnull 2023-04-22 21:18:57.821 : INFO: instruction count: 255: __C4187Compiled.__m4260split_ToArray 2023-04-22 21:18:57.821 : INFO: instruction count: 9: __C4187Compiled.setPartitionIndex 2023-04-22 21:18:57.821 : INFO: instruction count: 4: __C4187Compiled.addPartitionRegion 2023-04-22 21:18:57.822 : INFO: instruction count: 4: __C4187Compiled.setPool 2023-04-22 21:18:57.822 : INFO: instruction count: 3: __C4187Compiled.addHailClassLoader 2023-04-22 21:18:57.822 : INFO: instruction count: 3: __C4187Compiled.addFS 2023-04-22 21:18:57.822 : INFO: instruction count: 4: __C4187Compiled.addTaskContext 2023-04-22 21:18:57.822 : INFO: instruction count: 96: __C4187Compiled.addAndDecodeLiterals 2023-04-22 21:18:57.822 : INFO: instruction count: 18: __C4187Compiled.__m4284DECODE_r_struct_of_END_TO_SBaseStructPointer 2023-04-22 21:18:57.822 : INFO: instruction count: 27: __C4187Compiled.__m4285DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryENDEND_TO_SBaseStructPointer 2023-04-22 
21:18:57.822 : INFO: instruction count: 58: __C4187Compiled.__m4286INPLACE_DECODE_r_array_of_r_struct_of_r_binaryEND_TO_r_array_of_r_struct_of_r_stringEND 2023-04-22 21:18:57.822 : INFO: instruction count: 17: __C4187Compiled.__m4287INPLACE_DECODE_r_struct_of_r_binaryEND_TO_r_struct_of_r_stringEND 2023-04-22 21:18:57.822 : INFO: instruction count: 31: __C4187Compiled.__m4288INPLACE_DECODE_r_binary_TO_r_string 2023-04-22 21:18:57.822 : INFO: instruction count: 27: __C4187Compiled.__m4289DECODE_r_struct_of_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64ENDEND_TO_SBaseStructPointer 2023-04-22 21:18:57.822 : INFO: instruction count: 58: __C4187Compiled.__m4290INPLACE_DECODE_r_array_of_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_array_of_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.822 : INFO: instruction count: 26: __C4187Compiled.__m4291INPLACE_DECODE_r_struct_of_r_binaryANDr_array_of_r_float64END_TO_r_struct_of_r_stringANDr_array_of_r_float64END 2023-04-22 21:18:57.822 : INFO: instruction count: 58: __C4187Compiled.__m4292INPLACE_DECODE_r_array_of_r_float64_TO_r_array_of_r_float64 2023-04-22 21:18:57.822 : INFO: instruction count: 10: __C4187Compiled.__m4293INPLACE_DECODE_r_float64_TO_r_float64 2023-04-22 21:18:57.833 : INFO: RegionPool: REPORT_THRESHOLD: 2.0M allocated (1.5M blocks / 512.0K chunks), regions.size = 2, 0 current java objects, thread 14: Thread-5 2023-04-22 21:18:58.198 LAPACK: WARN: Failed to load implementation from: com.github.fommil.netlib.NativeSystemLAPACK 2023-04-22 21:18:58.345 LAPACK: WARN: Failed to load implementation from: com.github.fommil.netlib.NativeRefLAPACK 2023-04-22 21:18:58.450 MemoryStore: INFO: Block broadcast_203 stored as values in memory (estimated size 352.1 KiB, free 25.1 GiB) 2023-04-22 21:18:58.454 MemoryStore: INFO: Block broadcast_203_piece0 stored as bytes in memory (estimated size 343.8 KiB, free 25.1 GiB) 2023-04-22 21:18:58.454 BlockManagerInfo: INFO: Added broadcast_203_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 343.8 KiB, free: 25.3 GiB) 2023-04-22 21:18:58.455 SparkContext: INFO: Created broadcast 203 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:58.455 MemoryStore: INFO: Block broadcast_204 stored as values in memory (estimated size 4.8 KiB, free 25.1 GiB) 2023-04-22 21:18:58.457 MemoryStore: INFO: Block broadcast_204_piece0 stored as bytes in memory (estimated size 4.7 KiB, free 25.1 GiB) 2023-04-22 21:18:58.457 BlockManagerInfo: INFO: Added broadcast_204_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 4.7 KiB, free: 25.3 GiB) 2023-04-22 21:18:58.457 SparkContext: INFO: Created broadcast 204 from broadcast at SparkBackend.scala:354 2023-04-22 21:18:58.724 SparkContext: INFO: Starting job: collect at ContextRDD.scala:176 2023-04-22 21:18:58.726 DAGScheduler: INFO: Got job 47 (collect at ContextRDD.scala:176) with 28 output partitions 2023-04-22 21:18:58.726 DAGScheduler: INFO: Final stage: ResultStage 86 (collect at ContextRDD.scala:176) 2023-04-22 21:18:58.726 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:18:58.727 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:18:58.727 DAGScheduler: INFO: Submitting ResultStage 86 (MapPartitionsRDD[195] at mapPartitions at ContextRDD.scala:168), which has no missing parents 2023-04-22 21:18:58.742 MemoryStore: INFO: Block broadcast_205 stored as values in memory (estimated size 111.3 KiB, free 25.1 GiB) 2023-04-22 21:18:58.744 MemoryStore: INFO: Block broadcast_205_piece0 stored as bytes in 
memory (estimated size 42.6 KiB, free 25.1 GiB) 2023-04-22 21:18:58.744 BlockManagerInfo: INFO: Added broadcast_205_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 42.6 KiB, free: 25.3 GiB) 2023-04-22 21:18:58.745 SparkContext: INFO: Created broadcast 205 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:18:58.745 DAGScheduler: INFO: Submitting 28 missing tasks from ResultStage 86 (MapPartitionsRDD[195] at mapPartitions at ContextRDD.scala:168) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 2023-04-22 21:18:58.745 TaskSchedulerImpl: INFO: Adding task set 86.0 with 28 tasks resource profile 0 2023-04-22 21:18:58.746 TaskSetManager: INFO: Starting task 0.0 in stage 86.0 (TID 504) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:18:58.746 Executor: INFO: Running task 0.0 in stage 86.0 (TID 504) 2023-04-22 21:18:59.267 MemoryStore: INFO: Block rdd_188_0 stored as values in memory (estimated size 128.0 MiB, free 24.9 GiB) 2023-04-22 21:18:59.267 BlockManagerInfo: INFO: Added rdd_188_0 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.1 GiB) 2023-04-22 21:18:59.555 MemoryStore: INFO: Block rdd_188_28 stored as values in memory (estimated size 1760.2 KiB, free 24.9 GiB) 2023-04-22 21:18:59.555 BlockManagerInfo: INFO: Added rdd_188_28 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.1 GiB) 2023-04-22 21:18:59.561 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 0.0 in stage 86.0 (TID 504) 2023-04-22 21:18:59.641 : INFO: TaskReport: stage=86, partition=0, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:18:59.641 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 0.0 in stage 86.0 (TID 504) 2023-04-22 21:18:59.642 Executor: INFO: Finished task 0.0 in stage 86.0 (TID 504). 
923 bytes result sent to driver 2023-04-22 21:18:59.642 TaskSetManager: INFO: Starting task 1.0 in stage 86.0 (TID 505) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:18:59.644 TaskSetManager: INFO: Finished task 0.0 in stage 86.0 (TID 504) in 898 ms on uger-c010.broadinstitute.org (executor driver) (1/28) 2023-04-22 21:18:59.645 Executor: INFO: Running task 1.0 in stage 86.0 (TID 505) 2023-04-22 21:18:59.942 MemoryStore: INFO: Block rdd_188_1 stored as values in memory (estimated size 128.0 MiB, free 24.8 GiB) 2023-04-22 21:18:59.942 BlockManagerInfo: INFO: Added rdd_188_1 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.0 GiB) 2023-04-22 21:19:00.231 MemoryStore: INFO: Block rdd_188_29 stored as values in memory (estimated size 1760.2 KiB, free 24.8 GiB) 2023-04-22 21:19:00.231 BlockManagerInfo: INFO: Added rdd_188_29 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.0 GiB) 2023-04-22 21:19:00.236 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 1.0 in stage 86.0 (TID 505) 2023-04-22 21:19:00.252 : INFO: TaskReport: stage=86, partition=1, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:00.253 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 1.0 in stage 86.0 (TID 505) 2023-04-22 21:19:00.253 Executor: INFO: Finished task 1.0 in stage 86.0 (TID 505). 923 bytes result sent to driver 2023-04-22 21:19:00.253 TaskSetManager: INFO: Starting task 2.0 in stage 86.0 (TID 506) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:00.254 TaskSetManager: INFO: Finished task 1.0 in stage 86.0 (TID 505) in 612 ms on uger-c010.broadinstitute.org (executor driver) (2/28) 2023-04-22 21:19:00.257 Executor: INFO: Running task 2.0 in stage 86.0 (TID 506) 2023-04-22 21:19:00.509 MemoryStore: INFO: Block rdd_188_2 stored as values in memory (estimated size 128.0 MiB, free 24.7 GiB) 2023-04-22 21:19:00.509 BlockManagerInfo: INFO: Added rdd_188_2 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.9 GiB) 2023-04-22 21:19:00.807 MemoryStore: INFO: Block rdd_188_30 stored as values in memory (estimated size 1760.2 KiB, free 24.7 GiB) 2023-04-22 21:19:00.807 BlockManagerInfo: INFO: Added rdd_188_30 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.9 GiB) 2023-04-22 21:19:00.812 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 2.0 in stage 86.0 (TID 506) 2023-04-22 21:19:00.828 : INFO: TaskReport: stage=86, partition=2, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:00.828 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 2.0 in stage 86.0 (TID 506) 2023-04-22 21:19:00.828 Executor: INFO: Finished task 2.0 in stage 86.0 (TID 506). 
923 bytes result sent to driver 2023-04-22 21:19:00.828 TaskSetManager: INFO: Starting task 3.0 in stage 86.0 (TID 507) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:00.829 TaskSetManager: INFO: Finished task 2.0 in stage 86.0 (TID 506) in 576 ms on uger-c010.broadinstitute.org (executor driver) (3/28) 2023-04-22 21:19:00.832 Executor: INFO: Running task 3.0 in stage 86.0 (TID 507) 2023-04-22 21:19:01.080 MemoryStore: INFO: Block rdd_188_3 stored as values in memory (estimated size 128.0 MiB, free 24.6 GiB) 2023-04-22 21:19:01.080 BlockManagerInfo: INFO: Added rdd_188_3 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.8 GiB) 2023-04-22 21:19:01.369 MemoryStore: INFO: Block rdd_188_31 stored as values in memory (estimated size 1760.2 KiB, free 24.6 GiB) 2023-04-22 21:19:01.374 BlockManagerInfo: INFO: Added rdd_188_31 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.8 GiB) 2023-04-22 21:19:01.386 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 3.0 in stage 86.0 (TID 507) 2023-04-22 21:19:01.404 : INFO: TaskReport: stage=86, partition=3, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:01.404 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 3.0 in stage 86.0 (TID 507) 2023-04-22 21:19:01.405 Executor: INFO: Finished task 3.0 in stage 86.0 (TID 507). 923 bytes result sent to driver 2023-04-22 21:19:01.405 TaskSetManager: INFO: Starting task 4.0 in stage 86.0 (TID 508) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:01.405 TaskSetManager: INFO: Finished task 3.0 in stage 86.0 (TID 507) in 577 ms on uger-c010.broadinstitute.org (executor driver) (4/28) 2023-04-22 21:19:01.406 Executor: INFO: Running task 4.0 in stage 86.0 (TID 508) 2023-04-22 21:19:01.669 MemoryStore: INFO: Block rdd_188_4 stored as values in memory (estimated size 128.0 MiB, free 24.4 GiB) 2023-04-22 21:19:01.669 BlockManagerInfo: INFO: Added rdd_188_4 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.6 GiB) 2023-04-22 21:19:01.961 MemoryStore: INFO: Block rdd_188_32 stored as values in memory (estimated size 1760.2 KiB, free 24.4 GiB) 2023-04-22 21:19:01.961 BlockManagerInfo: INFO: Added rdd_188_32 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.6 GiB) 2023-04-22 21:19:01.965 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 4.0 in stage 86.0 (TID 508) 2023-04-22 21:19:01.987 : INFO: TaskReport: stage=86, partition=4, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:01.987 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 4.0 in stage 86.0 (TID 508) 2023-04-22 21:19:01.988 Executor: INFO: Finished task 4.0 in stage 86.0 (TID 508). 
923 bytes result sent to driver 2023-04-22 21:19:01.988 TaskSetManager: INFO: Starting task 5.0 in stage 86.0 (TID 509) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:01.988 TaskSetManager: INFO: Finished task 4.0 in stage 86.0 (TID 508) in 583 ms on uger-c010.broadinstitute.org (executor driver) (5/28) 2023-04-22 21:19:01.989 Executor: INFO: Running task 5.0 in stage 86.0 (TID 509) 2023-04-22 21:19:02.249 MemoryStore: INFO: Block rdd_188_5 stored as values in memory (estimated size 128.0 MiB, free 24.3 GiB) 2023-04-22 21:19:02.251 BlockManagerInfo: INFO: Added rdd_188_5 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.5 GiB) 2023-04-22 21:19:02.547 MemoryStore: INFO: Block rdd_188_33 stored as values in memory (estimated size 1760.2 KiB, free 24.3 GiB) 2023-04-22 21:19:02.549 BlockManagerInfo: INFO: Added rdd_188_33 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.5 GiB) 2023-04-22 21:19:02.553 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 5.0 in stage 86.0 (TID 509) 2023-04-22 21:19:02.570 : INFO: TaskReport: stage=86, partition=5, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:02.570 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 5.0 in stage 86.0 (TID 509) 2023-04-22 21:19:02.570 Executor: INFO: Finished task 5.0 in stage 86.0 (TID 509). 923 bytes result sent to driver 2023-04-22 21:19:02.571 TaskSetManager: INFO: Starting task 6.0 in stage 86.0 (TID 510) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:02.571 TaskSetManager: INFO: Finished task 5.0 in stage 86.0 (TID 509) in 583 ms on uger-c010.broadinstitute.org (executor driver) (6/28) 2023-04-22 21:19:02.573 Executor: INFO: Running task 6.0 in stage 86.0 (TID 510) 2023-04-22 21:19:02.845 MemoryStore: INFO: Block rdd_188_6 stored as values in memory (estimated size 128.0 MiB, free 24.2 GiB) 2023-04-22 21:19:02.845 BlockManagerInfo: INFO: Added rdd_188_6 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.4 GiB) 2023-04-22 21:19:03.135 MemoryStore: INFO: Block rdd_188_34 stored as values in memory (estimated size 1760.2 KiB, free 24.2 GiB) 2023-04-22 21:19:03.135 BlockManagerInfo: INFO: Added rdd_188_34 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.4 GiB) 2023-04-22 21:19:03.139 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 6.0 in stage 86.0 (TID 510) 2023-04-22 21:19:03.157 : INFO: TaskReport: stage=86, partition=6, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:03.157 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 6.0 in stage 86.0 (TID 510) 2023-04-22 21:19:03.157 Executor: INFO: Finished task 6.0 in stage 86.0 (TID 510). 
923 bytes result sent to driver 2023-04-22 21:19:03.157 TaskSetManager: INFO: Starting task 7.0 in stage 86.0 (TID 511) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:03.158 TaskSetManager: INFO: Finished task 6.0 in stage 86.0 (TID 510) in 587 ms on uger-c010.broadinstitute.org (executor driver) (7/28) 2023-04-22 21:19:03.162 Executor: INFO: Running task 7.0 in stage 86.0 (TID 511) 2023-04-22 21:19:03.428 MemoryStore: INFO: Block rdd_188_7 stored as values in memory (estimated size 128.0 MiB, free 24.1 GiB) 2023-04-22 21:19:03.428 BlockManagerInfo: INFO: Added rdd_188_7 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.3 GiB) 2023-04-22 21:19:03.721 MemoryStore: INFO: Block rdd_188_35 stored as values in memory (estimated size 1760.2 KiB, free 24.1 GiB) 2023-04-22 21:19:03.721 BlockManagerInfo: INFO: Added rdd_188_35 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.3 GiB) 2023-04-22 21:19:03.725 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 7.0 in stage 86.0 (TID 511) 2023-04-22 21:19:03.740 : INFO: TaskReport: stage=86, partition=7, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:03.740 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 7.0 in stage 86.0 (TID 511) 2023-04-22 21:19:03.741 Executor: INFO: Finished task 7.0 in stage 86.0 (TID 511). 923 bytes result sent to driver 2023-04-22 21:19:03.741 TaskSetManager: INFO: Starting task 8.0 in stage 86.0 (TID 512) (uger-c010.broadinstitute.org, executor driver, partition 8, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:03.746 TaskSetManager: INFO: Finished task 7.0 in stage 86.0 (TID 511) in 588 ms on uger-c010.broadinstitute.org (executor driver) (8/28) 2023-04-22 21:19:03.746 Executor: INFO: Running task 8.0 in stage 86.0 (TID 512) 2023-04-22 21:19:06.969 BlockManagerInfo: INFO: Removed broadcast_195_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 393.7 KiB, free: 24.3 GiB) 2023-04-22 21:19:07.028 BlockManagerInfo: INFO: Removed broadcast_199_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 2.3 MiB, free: 24.3 GiB) 2023-04-22 21:19:07.067 BlockManagerInfo: INFO: Removed broadcast_200_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 368.6 KiB, free: 24.3 GiB) 2023-04-22 21:19:07.118 BlockManagerInfo: INFO: Removed broadcast_198_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 24.3 GiB) 2023-04-22 21:19:07.165 BlockManagerInfo: INFO: Removed broadcast_191_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 155.0 B, free: 24.3 GiB) 2023-04-22 21:19:07.204 BlockManagerInfo: INFO: Removed broadcast_202_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 368.6 KiB, free: 24.3 GiB) 2023-04-22 21:19:07.206 MemoryStore: INFO: Block rdd_188_8 stored as values in memory (estimated size 128.0 MiB, free 24.0 GiB) 2023-04-22 21:19:07.206 BlockManagerInfo: INFO: Added rdd_188_8 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.1 GiB) 2023-04-22 21:19:07.255 BlockManagerInfo: INFO: Removed broadcast_193_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 49.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.316 BlockManagerInfo: INFO: Removed broadcast_194_piece0 on 
uger-c010.broadinstitute.org:46121 in memory (size: 32.4 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.366 BlockManagerInfo: INFO: Removed broadcast_196_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 162.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.405 BlockManagerInfo: INFO: Removed broadcast_192_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 719.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.519 BlockManagerInfo: INFO: Removed broadcast_24_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 65.3 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.617 BlockManagerInfo: INFO: Removed broadcast_27_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 97.4 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.668 MemoryStore: INFO: Block rdd_188_36 stored as values in memory (estimated size 1760.2 KiB, free 24.0 GiB) 2023-04-22 21:19:07.668 BlockManagerInfo: INFO: Added rdd_188_36 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.672 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 8.0 in stage 86.0 (TID 512) 2023-04-22 21:19:07.673 BlockManagerInfo: INFO: Removed broadcast_21_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 114.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.732 BlockManagerInfo: INFO: Removed broadcast_25_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 421.2 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.742 BlockManagerInfo: INFO: Removed broadcast_17_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 113.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.744 BlockManagerInfo: INFO: Removed broadcast_13_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 2.3 MiB, free: 24.1 GiB) 2023-04-22 21:19:07.745 BlockManagerInfo: INFO: Removed broadcast_23_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 70.0 B, free: 24.1 GiB) 2023-04-22 21:19:07.746 : INFO: TaskReport: stage=86, partition=8, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:07.746 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 8.0 in stage 86.0 (TID 512) 2023-04-22 21:19:07.746 Executor: INFO: Finished task 8.0 in stage 86.0 (TID 512). 
966 bytes result sent to driver 2023-04-22 21:19:07.747 TaskSetManager: INFO: Starting task 9.0 in stage 86.0 (TID 513) (uger-c010.broadinstitute.org, executor driver, partition 9, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:07.747 TaskSetManager: INFO: Finished task 8.0 in stage 86.0 (TID 512) in 4006 ms on uger-c010.broadinstitute.org (executor driver) (9/28) 2023-04-22 21:19:07.749 Executor: INFO: Running task 9.0 in stage 86.0 (TID 513) 2023-04-22 21:19:07.767 BlockManagerInfo: INFO: Removed broadcast_18_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 14.2 KiB, free: 24.1 GiB) 2023-04-22 21:19:07.866 BlockManager: INFO: Removing RDD 12 2023-04-22 21:19:08.040 MemoryStore: INFO: Block rdd_188_9 stored as values in memory (estimated size 128.0 MiB, free 27.5 GiB) 2023-04-22 21:19:08.040 BlockManagerInfo: INFO: Added rdd_188_9 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 27.6 GiB) 2023-04-22 21:19:08.327 MemoryStore: INFO: Block rdd_188_37 stored as values in memory (estimated size 1760.2 KiB, free 27.5 GiB) 2023-04-22 21:19:08.327 BlockManagerInfo: INFO: Added rdd_188_37 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 27.6 GiB) 2023-04-22 21:19:08.331 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 9.0 in stage 86.0 (TID 513) 2023-04-22 21:19:08.350 : INFO: TaskReport: stage=86, partition=9, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:08.350 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 9.0 in stage 86.0 (TID 513) 2023-04-22 21:19:08.351 Executor: INFO: Finished task 9.0 in stage 86.0 (TID 513). 923 bytes result sent to driver 2023-04-22 21:19:08.351 TaskSetManager: INFO: Starting task 10.0 in stage 86.0 (TID 514) (uger-c010.broadinstitute.org, executor driver, partition 10, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:08.351 TaskSetManager: INFO: Finished task 9.0 in stage 86.0 (TID 513) in 605 ms on uger-c010.broadinstitute.org (executor driver) (10/28) 2023-04-22 21:19:08.352 Executor: INFO: Running task 10.0 in stage 86.0 (TID 514) 2023-04-22 21:19:08.612 MemoryStore: INFO: Block rdd_188_10 stored as values in memory (estimated size 128.0 MiB, free 27.3 GiB) 2023-04-22 21:19:08.614 BlockManagerInfo: INFO: Added rdd_188_10 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 27.4 GiB) 2023-04-22 21:19:08.905 MemoryStore: INFO: Block rdd_188_38 stored as values in memory (estimated size 1760.2 KiB, free 27.3 GiB) 2023-04-22 21:19:08.905 BlockManagerInfo: INFO: Added rdd_188_38 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 27.4 GiB) 2023-04-22 21:19:08.911 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 10.0 in stage 86.0 (TID 514) 2023-04-22 21:19:08.927 : INFO: TaskReport: stage=86, partition=10, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:08.927 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 10.0 in stage 86.0 (TID 514) 2023-04-22 21:19:08.927 Executor: INFO: Finished task 10.0 in stage 86.0 (TID 514). 
924 bytes result sent to driver 2023-04-22 21:19:08.928 TaskSetManager: INFO: Starting task 11.0 in stage 86.0 (TID 515) (uger-c010.broadinstitute.org, executor driver, partition 11, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:08.928 TaskSetManager: INFO: Finished task 10.0 in stage 86.0 (TID 514) in 577 ms on uger-c010.broadinstitute.org (executor driver) (11/28) 2023-04-22 21:19:08.928 Executor: INFO: Running task 11.0 in stage 86.0 (TID 515) 2023-04-22 21:19:09.178 MemoryStore: INFO: Block rdd_188_11 stored as values in memory (estimated size 128.0 MiB, free 27.2 GiB) 2023-04-22 21:19:09.179 BlockManagerInfo: INFO: Added rdd_188_11 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 27.3 GiB) 2023-04-22 21:19:09.480 MemoryStore: INFO: Block rdd_188_39 stored as values in memory (estimated size 1760.2 KiB, free 27.2 GiB) 2023-04-22 21:19:09.480 BlockManagerInfo: INFO: Added rdd_188_39 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 27.3 GiB) 2023-04-22 21:19:09.485 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 11.0 in stage 86.0 (TID 515) 2023-04-22 21:19:09.506 : INFO: TaskReport: stage=86, partition=11, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:09.506 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 11.0 in stage 86.0 (TID 515) 2023-04-22 21:19:09.506 Executor: INFO: Finished task 11.0 in stage 86.0 (TID 515). 924 bytes result sent to driver 2023-04-22 21:19:09.506 TaskSetManager: INFO: Starting task 12.0 in stage 86.0 (TID 516) (uger-c010.broadinstitute.org, executor driver, partition 12, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:09.507 TaskSetManager: INFO: Finished task 11.0 in stage 86.0 (TID 515) in 579 ms on uger-c010.broadinstitute.org (executor driver) (12/28) 2023-04-22 21:19:09.507 Executor: INFO: Running task 12.0 in stage 86.0 (TID 516) 2023-04-22 21:19:09.779 MemoryStore: INFO: Block rdd_188_12 stored as values in memory (estimated size 128.0 MiB, free 27.1 GiB) 2023-04-22 21:19:09.779 BlockManagerInfo: INFO: Added rdd_188_12 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 27.2 GiB) 2023-04-22 21:19:10.065 MemoryStore: INFO: Block rdd_188_40 stored as values in memory (estimated size 1760.2 KiB, free 27.1 GiB) 2023-04-22 21:19:10.066 BlockManagerInfo: INFO: Added rdd_188_40 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 27.2 GiB) 2023-04-22 21:19:10.070 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 12.0 in stage 86.0 (TID 516) 2023-04-22 21:19:10.085 : INFO: TaskReport: stage=86, partition=12, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:10.085 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 12.0 in stage 86.0 (TID 516) 2023-04-22 21:19:10.086 Executor: INFO: Finished task 12.0 in stage 86.0 (TID 516). 
924 bytes result sent to driver 2023-04-22 21:19:10.086 TaskSetManager: INFO: Starting task 13.0 in stage 86.0 (TID 517) (uger-c010.broadinstitute.org, executor driver, partition 13, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:10.086 TaskSetManager: INFO: Finished task 12.0 in stage 86.0 (TID 516) in 580 ms on uger-c010.broadinstitute.org (executor driver) (13/28) 2023-04-22 21:19:10.087 Executor: INFO: Running task 13.0 in stage 86.0 (TID 517) 2023-04-22 21:19:10.309 MemoryStore: INFO: Block rdd_188_13 stored as values in memory (estimated size 128.0 MiB, free 27.0 GiB) 2023-04-22 21:19:10.310 BlockManagerInfo: INFO: Added rdd_188_13 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 27.0 GiB) 2023-04-22 21:19:10.613 MemoryStore: INFO: Block rdd_188_41 stored as values in memory (estimated size 1760.2 KiB, free 27.0 GiB) 2023-04-22 21:19:10.613 BlockManagerInfo: INFO: Added rdd_188_41 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 27.0 GiB) 2023-04-22 21:19:10.617 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 13.0 in stage 86.0 (TID 517) 2023-04-22 21:19:10.634 : INFO: TaskReport: stage=86, partition=13, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:10.634 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 86.0 (TID 517) 2023-04-22 21:19:10.634 Executor: INFO: Finished task 13.0 in stage 86.0 (TID 517). 924 bytes result sent to driver 2023-04-22 21:19:10.634 TaskSetManager: INFO: Starting task 14.0 in stage 86.0 (TID 518) (uger-c010.broadinstitute.org, executor driver, partition 14, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:10.635 TaskSetManager: INFO: Finished task 13.0 in stage 86.0 (TID 517) in 549 ms on uger-c010.broadinstitute.org (executor driver) (14/28) 2023-04-22 21:19:10.635 Executor: INFO: Running task 14.0 in stage 86.0 (TID 518) 2023-04-22 21:19:10.896 MemoryStore: INFO: Block rdd_188_14 stored as values in memory (estimated size 128.0 MiB, free 26.8 GiB) 2023-04-22 21:19:10.897 BlockManagerInfo: INFO: Added rdd_188_14 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.9 GiB) 2023-04-22 21:19:11.182 MemoryStore: INFO: Block rdd_188_42 stored as values in memory (estimated size 1760.2 KiB, free 26.8 GiB) 2023-04-22 21:19:11.182 BlockManagerInfo: INFO: Added rdd_188_42 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.9 GiB) 2023-04-22 21:19:11.186 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 14.0 in stage 86.0 (TID 518) 2023-04-22 21:19:11.202 : INFO: TaskReport: stage=86, partition=14, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:11.202 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 86.0 (TID 518) 2023-04-22 21:19:11.203 Executor: INFO: Finished task 14.0 in stage 86.0 (TID 518). 
924 bytes result sent to driver 2023-04-22 21:19:11.203 TaskSetManager: INFO: Starting task 15.0 in stage 86.0 (TID 519) (uger-c010.broadinstitute.org, executor driver, partition 15, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:11.203 TaskSetManager: INFO: Finished task 14.0 in stage 86.0 (TID 518) in 569 ms on uger-c010.broadinstitute.org (executor driver) (15/28) 2023-04-22 21:19:11.203 Executor: INFO: Running task 15.0 in stage 86.0 (TID 519) 2023-04-22 21:19:11.447 MemoryStore: INFO: Block rdd_188_15 stored as values in memory (estimated size 128.0 MiB, free 26.7 GiB) 2023-04-22 21:19:11.447 BlockManagerInfo: INFO: Added rdd_188_15 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.8 GiB) 2023-04-22 21:19:11.737 MemoryStore: INFO: Block rdd_188_43 stored as values in memory (estimated size 1760.2 KiB, free 26.7 GiB) 2023-04-22 21:19:11.737 BlockManagerInfo: INFO: Added rdd_188_43 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.8 GiB) 2023-04-22 21:19:11.742 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 15.0 in stage 86.0 (TID 519) 2023-04-22 21:19:11.756 : INFO: TaskReport: stage=86, partition=15, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:11.756 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 86.0 (TID 519) 2023-04-22 21:19:11.757 Executor: INFO: Finished task 15.0 in stage 86.0 (TID 519). 924 bytes result sent to driver 2023-04-22 21:19:11.757 TaskSetManager: INFO: Starting task 16.0 in stage 86.0 (TID 520) (uger-c010.broadinstitute.org, executor driver, partition 16, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:11.757 TaskSetManager: INFO: Finished task 15.0 in stage 86.0 (TID 519) in 554 ms on uger-c010.broadinstitute.org (executor driver) (16/28) 2023-04-22 21:19:11.757 Executor: INFO: Running task 16.0 in stage 86.0 (TID 520) 2023-04-22 21:19:12.014 MemoryStore: INFO: Block rdd_188_16 stored as values in memory (estimated size 128.0 MiB, free 26.6 GiB) 2023-04-22 21:19:12.014 BlockManagerInfo: INFO: Added rdd_188_16 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.7 GiB) 2023-04-22 21:19:12.305 MemoryStore: INFO: Block rdd_188_44 stored as values in memory (estimated size 1760.2 KiB, free 26.6 GiB) 2023-04-22 21:19:12.305 BlockManagerInfo: INFO: Added rdd_188_44 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.7 GiB) 2023-04-22 21:19:12.310 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 16.0 in stage 86.0 (TID 520) 2023-04-22 21:19:12.340 : INFO: TaskReport: stage=86, partition=16, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:12.340 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 86.0 (TID 520) 2023-04-22 21:19:12.341 Executor: INFO: Finished task 16.0 in stage 86.0 (TID 520). 
924 bytes result sent to driver 2023-04-22 21:19:12.341 TaskSetManager: INFO: Starting task 17.0 in stage 86.0 (TID 521) (uger-c010.broadinstitute.org, executor driver, partition 17, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:12.341 TaskSetManager: INFO: Finished task 16.0 in stage 86.0 (TID 520) in 584 ms on uger-c010.broadinstitute.org (executor driver) (17/28) 2023-04-22 21:19:12.341 Executor: INFO: Running task 17.0 in stage 86.0 (TID 521) 2023-04-22 21:19:12.598 MemoryStore: INFO: Block rdd_188_17 stored as values in memory (estimated size 128.0 MiB, free 26.4 GiB) 2023-04-22 21:19:12.598 BlockManagerInfo: INFO: Added rdd_188_17 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.5 GiB) 2023-04-22 21:19:12.884 MemoryStore: INFO: Block rdd_188_45 stored as values in memory (estimated size 1760.2 KiB, free 26.4 GiB) 2023-04-22 21:19:12.884 BlockManagerInfo: INFO: Added rdd_188_45 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.5 GiB) 2023-04-22 21:19:12.888 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 17.0 in stage 86.0 (TID 521) 2023-04-22 21:19:12.903 : INFO: TaskReport: stage=86, partition=17, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:12.903 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 86.0 (TID 521) 2023-04-22 21:19:12.903 Executor: INFO: Finished task 17.0 in stage 86.0 (TID 521). 924 bytes result sent to driver 2023-04-22 21:19:12.904 TaskSetManager: INFO: Starting task 18.0 in stage 86.0 (TID 522) (uger-c010.broadinstitute.org, executor driver, partition 18, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:12.904 TaskSetManager: INFO: Finished task 17.0 in stage 86.0 (TID 521) in 563 ms on uger-c010.broadinstitute.org (executor driver) (18/28) 2023-04-22 21:19:12.905 Executor: INFO: Running task 18.0 in stage 86.0 (TID 522) 2023-04-22 21:19:13.162 MemoryStore: INFO: Block rdd_188_18 stored as values in memory (estimated size 128.0 MiB, free 26.3 GiB) 2023-04-22 21:19:13.162 BlockManagerInfo: INFO: Added rdd_188_18 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.4 GiB) 2023-04-22 21:19:13.452 MemoryStore: INFO: Block rdd_188_46 stored as values in memory (estimated size 1760.2 KiB, free 26.3 GiB) 2023-04-22 21:19:13.452 BlockManagerInfo: INFO: Added rdd_188_46 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.4 GiB) 2023-04-22 21:19:13.456 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 18.0 in stage 86.0 (TID 522) 2023-04-22 21:19:13.472 : INFO: TaskReport: stage=86, partition=18, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:13.472 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 86.0 (TID 522) 2023-04-22 21:19:13.472 Executor: INFO: Finished task 18.0 in stage 86.0 (TID 522). 
924 bytes result sent to driver 2023-04-22 21:19:13.472 TaskSetManager: INFO: Starting task 19.0 in stage 86.0 (TID 523) (uger-c010.broadinstitute.org, executor driver, partition 19, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:13.473 TaskSetManager: INFO: Finished task 18.0 in stage 86.0 (TID 522) in 570 ms on uger-c010.broadinstitute.org (executor driver) (19/28) 2023-04-22 21:19:13.474 Executor: INFO: Running task 19.0 in stage 86.0 (TID 523) 2023-04-22 21:19:13.733 MemoryStore: INFO: Block rdd_188_19 stored as values in memory (estimated size 128.0 MiB, free 26.2 GiB) 2023-04-22 21:19:13.733 BlockManagerInfo: INFO: Added rdd_188_19 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.3 GiB) 2023-04-22 21:19:14.287 MemoryStore: INFO: Block rdd_188_47 stored as values in memory (estimated size 1760.2 KiB, free 26.2 GiB) 2023-04-22 21:19:14.287 BlockManagerInfo: INFO: Added rdd_188_47 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.3 GiB) 2023-04-22 21:19:14.291 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 19.0 in stage 86.0 (TID 523) 2023-04-22 21:19:14.308 : INFO: TaskReport: stage=86, partition=19, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:14.308 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 86.0 (TID 523) 2023-04-22 21:19:14.309 Executor: INFO: Finished task 19.0 in stage 86.0 (TID 523). 924 bytes result sent to driver 2023-04-22 21:19:14.309 TaskSetManager: INFO: Starting task 20.0 in stage 86.0 (TID 524) (uger-c010.broadinstitute.org, executor driver, partition 20, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:14.309 TaskSetManager: INFO: Finished task 19.0 in stage 86.0 (TID 523) in 837 ms on uger-c010.broadinstitute.org (executor driver) (20/28) 2023-04-22 21:19:14.309 Executor: INFO: Running task 20.0 in stage 86.0 (TID 524) 2023-04-22 21:19:14.582 MemoryStore: INFO: Block rdd_188_20 stored as values in memory (estimated size 128.0 MiB, free 26.1 GiB) 2023-04-22 21:19:14.582 BlockManagerInfo: INFO: Added rdd_188_20 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.2 GiB) 2023-04-22 21:19:14.868 MemoryStore: INFO: Block rdd_188_48 stored as values in memory (estimated size 1760.2 KiB, free 26.1 GiB) 2023-04-22 21:19:14.871 BlockManagerInfo: INFO: Added rdd_188_48 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.2 GiB) 2023-04-22 21:19:14.876 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 20.0 in stage 86.0 (TID 524) 2023-04-22 21:19:14.894 : INFO: TaskReport: stage=86, partition=20, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:14.894 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 86.0 (TID 524) 2023-04-22 21:19:14.894 Executor: INFO: Finished task 20.0 in stage 86.0 (TID 524). 
924 bytes result sent to driver 2023-04-22 21:19:14.895 TaskSetManager: INFO: Starting task 21.0 in stage 86.0 (TID 525) (uger-c010.broadinstitute.org, executor driver, partition 21, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:14.896 TaskSetManager: INFO: Finished task 20.0 in stage 86.0 (TID 524) in 586 ms on uger-c010.broadinstitute.org (executor driver) (21/28) 2023-04-22 21:19:14.896 Executor: INFO: Running task 21.0 in stage 86.0 (TID 525) 2023-04-22 21:19:15.168 MemoryStore: INFO: Block rdd_188_21 stored as values in memory (estimated size 128.0 MiB, free 25.9 GiB) 2023-04-22 21:19:15.170 BlockManagerInfo: INFO: Added rdd_188_21 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 26.0 GiB) 2023-04-22 21:19:15.638 MemoryStore: INFO: Block rdd_188_49 stored as values in memory (estimated size 1760.2 KiB, free 25.9 GiB) 2023-04-22 21:19:15.638 BlockManagerInfo: INFO: Added rdd_188_49 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 26.0 GiB) 2023-04-22 21:19:15.643 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 21.0 in stage 86.0 (TID 525) 2023-04-22 21:19:15.659 : INFO: TaskReport: stage=86, partition=21, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:15.660 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 86.0 (TID 525) 2023-04-22 21:19:15.660 Executor: INFO: Finished task 21.0 in stage 86.0 (TID 525). 924 bytes result sent to driver 2023-04-22 21:19:15.661 TaskSetManager: INFO: Starting task 22.0 in stage 86.0 (TID 526) (uger-c010.broadinstitute.org, executor driver, partition 22, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:15.661 TaskSetManager: INFO: Finished task 21.0 in stage 86.0 (TID 525) in 767 ms on uger-c010.broadinstitute.org (executor driver) (22/28) 2023-04-22 21:19:15.661 Executor: INFO: Running task 22.0 in stage 86.0 (TID 526) 2023-04-22 21:19:15.923 MemoryStore: INFO: Block rdd_188_22 stored as values in memory (estimated size 128.0 MiB, free 25.8 GiB) 2023-04-22 21:19:15.923 BlockManagerInfo: INFO: Added rdd_188_22 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.9 GiB) 2023-04-22 21:19:16.220 MemoryStore: INFO: Block rdd_188_50 stored as values in memory (estimated size 1760.2 KiB, free 25.8 GiB) 2023-04-22 21:19:16.220 BlockManagerInfo: INFO: Added rdd_188_50 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.9 GiB) 2023-04-22 21:19:16.225 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 22.0 in stage 86.0 (TID 526) 2023-04-22 21:19:16.241 : INFO: TaskReport: stage=86, partition=22, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:16.241 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 86.0 (TID 526) 2023-04-22 21:19:16.242 Executor: INFO: Finished task 22.0 in stage 86.0 (TID 526). 
924 bytes result sent to driver 2023-04-22 21:19:16.243 TaskSetManager: INFO: Starting task 23.0 in stage 86.0 (TID 527) (uger-c010.broadinstitute.org, executor driver, partition 23, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:16.243 TaskSetManager: INFO: Finished task 22.0 in stage 86.0 (TID 526) in 583 ms on uger-c010.broadinstitute.org (executor driver) (23/28) 2023-04-22 21:19:16.244 Executor: INFO: Running task 23.0 in stage 86.0 (TID 527) 2023-04-22 21:19:16.525 MemoryStore: INFO: Block rdd_188_23 stored as values in memory (estimated size 128.0 MiB, free 25.7 GiB) 2023-04-22 21:19:16.525 BlockManagerInfo: INFO: Added rdd_188_23 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.8 GiB) 2023-04-22 21:19:16.980 MemoryStore: INFO: Block rdd_188_51 stored as values in memory (estimated size 1760.2 KiB, free 25.7 GiB) 2023-04-22 21:19:16.980 BlockManagerInfo: INFO: Added rdd_188_51 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.8 GiB) 2023-04-22 21:19:16.985 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 23.0 in stage 86.0 (TID 527) 2023-04-22 21:19:17.034 : INFO: TaskReport: stage=86, partition=23, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:17.034 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 86.0 (TID 527) 2023-04-22 21:19:17.034 Executor: INFO: Finished task 23.0 in stage 86.0 (TID 527). 924 bytes result sent to driver 2023-04-22 21:19:17.034 TaskSetManager: INFO: Starting task 24.0 in stage 86.0 (TID 528) (uger-c010.broadinstitute.org, executor driver, partition 24, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:17.035 TaskSetManager: INFO: Finished task 23.0 in stage 86.0 (TID 527) in 793 ms on uger-c010.broadinstitute.org (executor driver) (24/28) 2023-04-22 21:19:17.044 Executor: INFO: Running task 24.0 in stage 86.0 (TID 528) 2023-04-22 21:19:17.352 MemoryStore: INFO: Block rdd_188_24 stored as values in memory (estimated size 128.0 MiB, free 25.6 GiB) 2023-04-22 21:19:17.352 BlockManagerInfo: INFO: Added rdd_188_24 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.7 GiB) 2023-04-22 21:19:17.642 MemoryStore: INFO: Block rdd_188_52 stored as values in memory (estimated size 1760.2 KiB, free 25.6 GiB) 2023-04-22 21:19:17.642 BlockManagerInfo: INFO: Added rdd_188_52 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.7 GiB) 2023-04-22 21:19:17.646 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 24.0 in stage 86.0 (TID 528) 2023-04-22 21:19:17.663 : INFO: TaskReport: stage=86, partition=24, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:17.663 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 86.0 (TID 528) 2023-04-22 21:19:17.663 Executor: INFO: Finished task 24.0 in stage 86.0 (TID 528). 
924 bytes result sent to driver 2023-04-22 21:19:17.664 TaskSetManager: INFO: Starting task 25.0 in stage 86.0 (TID 529) (uger-c010.broadinstitute.org, executor driver, partition 25, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:17.664 TaskSetManager: INFO: Finished task 24.0 in stage 86.0 (TID 528) in 630 ms on uger-c010.broadinstitute.org (executor driver) (25/28) 2023-04-22 21:19:17.665 Executor: INFO: Running task 25.0 in stage 86.0 (TID 529) 2023-04-22 21:19:17.932 MemoryStore: INFO: Block rdd_188_25 stored as values in memory (estimated size 128.0 MiB, free 25.4 GiB) 2023-04-22 21:19:17.932 BlockManagerInfo: INFO: Added rdd_188_25 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.5 GiB) 2023-04-22 21:19:18.479 MemoryStore: INFO: Block rdd_188_53 stored as values in memory (estimated size 1760.2 KiB, free 25.4 GiB) 2023-04-22 21:19:18.480 BlockManagerInfo: INFO: Added rdd_188_53 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.5 GiB) 2023-04-22 21:19:18.487 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 25.0 in stage 86.0 (TID 529) 2023-04-22 21:19:18.504 : INFO: TaskReport: stage=86, partition=25, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:18.505 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 86.0 (TID 529) 2023-04-22 21:19:18.505 Executor: INFO: Finished task 25.0 in stage 86.0 (TID 529). 924 bytes result sent to driver 2023-04-22 21:19:18.506 TaskSetManager: INFO: Starting task 26.0 in stage 86.0 (TID 530) (uger-c010.broadinstitute.org, executor driver, partition 26, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:18.506 TaskSetManager: INFO: Finished task 25.0 in stage 86.0 (TID 529) in 843 ms on uger-c010.broadinstitute.org (executor driver) (26/28) 2023-04-22 21:19:18.507 Executor: INFO: Running task 26.0 in stage 86.0 (TID 530) 2023-04-22 21:19:18.759 MemoryStore: INFO: Block rdd_188_26 stored as values in memory (estimated size 128.0 MiB, free 25.3 GiB) 2023-04-22 21:19:18.759 BlockManagerInfo: INFO: Added rdd_188_26 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.4 GiB) 2023-04-22 21:19:19.058 MemoryStore: INFO: Block rdd_188_54 stored as values in memory (estimated size 1760.2 KiB, free 25.3 GiB) 2023-04-22 21:19:19.058 BlockManagerInfo: INFO: Added rdd_188_54 in memory on uger-c010.broadinstitute.org:46121 (size: 1760.2 KiB, free: 25.4 GiB) 2023-04-22 21:19:19.063 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 26.0 in stage 86.0 (TID 530) 2023-04-22 21:19:19.080 : INFO: TaskReport: stage=86, partition=26, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:19.080 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 86.0 (TID 530) 2023-04-22 21:19:19.080 Executor: INFO: Finished task 26.0 in stage 86.0 (TID 530). 
924 bytes result sent to driver 2023-04-22 21:19:19.081 TaskSetManager: INFO: Starting task 27.0 in stage 86.0 (TID 531) (uger-c010.broadinstitute.org, executor driver, partition 27, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:19.081 TaskSetManager: INFO: Finished task 26.0 in stage 86.0 (TID 530) in 576 ms on uger-c010.broadinstitute.org (executor driver) (27/28) 2023-04-22 21:19:19.082 Executor: INFO: Running task 27.0 in stage 86.0 (TID 531) 2023-04-22 21:19:19.316 MemoryStore: INFO: Block rdd_188_27 stored as values in memory (estimated size 125.0 MiB, free 25.2 GiB) 2023-04-22 21:19:19.316 BlockManagerInfo: INFO: Added rdd_188_27 in memory on uger-c010.broadinstitute.org:46121 (size: 125.0 MiB, free: 25.3 GiB) 2023-04-22 21:19:19.614 MemoryStore: INFO: Block rdd_188_55 stored as values in memory (estimated size 1718.5 KiB, free 25.2 GiB) 2023-04-22 21:19:19.614 BlockManagerInfo: INFO: Added rdd_188_55 in memory on uger-c010.broadinstitute.org:46121 (size: 1718.5 KiB, free: 25.3 GiB) 2023-04-22 21:19:19.619 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 27.0 in stage 86.0 (TID 531) 2023-04-22 21:19:19.637 : INFO: TaskReport: stage=86, partition=27, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:19.637 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 86.0 (TID 531) 2023-04-22 21:19:19.638 Executor: INFO: Finished task 27.0 in stage 86.0 (TID 531). 924 bytes result sent to driver 2023-04-22 21:19:19.638 TaskSetManager: INFO: Finished task 27.0 in stage 86.0 (TID 531) in 557 ms on uger-c010.broadinstitute.org (executor driver) (28/28) 2023-04-22 21:19:19.638 TaskSchedulerImpl: INFO: Removed TaskSet 86.0, whose tasks have all completed, from pool 2023-04-22 21:19:19.639 DAGScheduler: INFO: ResultStage 86 (collect at ContextRDD.scala:176) finished in 20.908 s 2023-04-22 21:19:19.639 DAGScheduler: INFO: Job 47 is finished. 
Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:19:19.639 TaskSchedulerImpl: INFO: Killing all running tasks in stage 86: Stage finished 2023-04-22 21:19:19.639 DAGScheduler: INFO: Job 47 finished: collect at ContextRDD.scala:176, took 20.914900 s 2023-04-22 21:19:19.670 Hail: INFO: wrote matrix with 11 rows and 114591 columns as 28 blocks of size 4096 to /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/pcrelate-write-read-T5on4rcsdvcL5XkDO9pZGQ.bm 2023-04-22 21:19:19.674 MemoryStore: INFO: Block broadcast_206 stored as values in memory (estimated size 352.1 KiB, free 25.2 GiB) 2023-04-22 21:19:19.678 MemoryStore: INFO: Block broadcast_206_piece0 stored as bytes in memory (estimated size 320.5 KiB, free 25.2 GiB) 2023-04-22 21:19:19.678 BlockManagerInfo: INFO: Added broadcast_206_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 320.5 KiB, free: 25.3 GiB) 2023-04-22 21:19:19.679 SparkContext: INFO: Created broadcast 206 from broadcast at SparkBackend.scala:354 2023-04-22 21:19:19.679 MemoryStore: INFO: Block broadcast_207 stored as values in memory (estimated size 4.8 KiB, free 25.2 GiB) 2023-04-22 21:19:19.681 MemoryStore: INFO: Block broadcast_207_piece0 stored as bytes in memory (estimated size 4.4 KiB, free 25.2 GiB) 2023-04-22 21:19:19.681 BlockManagerInfo: INFO: Added broadcast_207_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 4.4 KiB, free: 25.3 GiB) 2023-04-22 21:19:19.682 SparkContext: INFO: Created broadcast 207 from broadcast at SparkBackend.scala:354 2023-04-22 21:19:21.556 SparkContext: INFO: Starting job: collect at ContextRDD.scala:176 2023-04-22 21:19:21.558 DAGScheduler: INFO: Got job 48 (collect at ContextRDD.scala:176) with 56 output partitions 2023-04-22 21:19:21.558 DAGScheduler: INFO: Final stage: ResultStage 87 (collect at ContextRDD.scala:176) 2023-04-22 21:19:21.558 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:19:21.558 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:19:21.558 DAGScheduler: INFO: Submitting ResultStage 87 (MapPartitionsRDD[203] at mapPartitions at ContextRDD.scala:168), which has no missing parents 2023-04-22 21:19:21.572 MemoryStore: INFO: Block broadcast_208 stored as values in memory (estimated size 109.2 KiB, free 25.2 GiB) 2023-04-22 21:19:21.574 MemoryStore: INFO: Block broadcast_208_piece0 stored as bytes in memory (estimated size 40.9 KiB, free 25.2 GiB) 2023-04-22 21:19:21.574 BlockManagerInfo: INFO: Added broadcast_208_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 40.9 KiB, free: 25.3 GiB) 2023-04-22 21:19:21.577 SparkContext: INFO: Created broadcast 208 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:19:21.577 DAGScheduler: INFO: Submitting 56 missing tasks from ResultStage 87 (MapPartitionsRDD[203] at mapPartitions at ContextRDD.scala:168) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14)) 2023-04-22 21:19:21.577 TaskSchedulerImpl: INFO: Adding task set 87.0 with 56 tasks resource profile 0 2023-04-22 21:19:21.578 TaskSetManager: INFO: Starting task 0.0 in stage 87.0 (TID 532) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:21.578 Executor: INFO: Running task 0.0 in stage 87.0 (TID 532) 2023-04-22 21:19:21.842 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 0.0 in stage 87.0 (TID 532) 2023-04-22 21:19:22.405 : INFO: TaskReport: stage=87, partition=0, 
attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:22.405 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 0.0 in stage 87.0 (TID 532) 2023-04-22 21:19:22.406 Executor: INFO: Finished task 0.0 in stage 87.0 (TID 532). 925 bytes result sent to driver 2023-04-22 21:19:22.406 TaskSetManager: INFO: Starting task 1.0 in stage 87.0 (TID 533) (uger-c010.broadinstitute.org, executor driver, partition 1, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:22.407 TaskSetManager: INFO: Finished task 0.0 in stage 87.0 (TID 532) in 829 ms on uger-c010.broadinstitute.org (executor driver) (1/56) 2023-04-22 21:19:22.432 Executor: INFO: Running task 1.0 in stage 87.0 (TID 533) 2023-04-22 21:19:22.694 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 1.0 in stage 87.0 (TID 533) 2023-04-22 21:19:23.182 : INFO: TaskReport: stage=87, partition=1, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:23.182 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 1.0 in stage 87.0 (TID 533) 2023-04-22 21:19:23.182 Executor: INFO: Finished task 1.0 in stage 87.0 (TID 533). 925 bytes result sent to driver 2023-04-22 21:19:23.183 TaskSetManager: INFO: Starting task 2.0 in stage 87.0 (TID 534) (uger-c010.broadinstitute.org, executor driver, partition 2, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:23.183 TaskSetManager: INFO: Finished task 1.0 in stage 87.0 (TID 533) in 777 ms on uger-c010.broadinstitute.org (executor driver) (2/56) 2023-04-22 21:19:23.184 Executor: INFO: Running task 2.0 in stage 87.0 (TID 534) 2023-04-22 21:19:23.445 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 2.0 in stage 87.0 (TID 534) 2023-04-22 21:19:23.924 : INFO: TaskReport: stage=87, partition=2, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:23.924 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 2.0 in stage 87.0 (TID 534) 2023-04-22 21:19:23.924 Executor: INFO: Finished task 2.0 in stage 87.0 (TID 534). 
925 bytes result sent to driver 2023-04-22 21:19:23.925 TaskSetManager: INFO: Starting task 3.0 in stage 87.0 (TID 535) (uger-c010.broadinstitute.org, executor driver, partition 3, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:23.925 TaskSetManager: INFO: Finished task 2.0 in stage 87.0 (TID 534) in 742 ms on uger-c010.broadinstitute.org (executor driver) (3/56) 2023-04-22 21:19:23.926 Executor: INFO: Running task 3.0 in stage 87.0 (TID 535) 2023-04-22 21:19:24.190 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 3.0 in stage 87.0 (TID 535) 2023-04-22 21:19:24.694 : INFO: TaskReport: stage=87, partition=3, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:24.694 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 3.0 in stage 87.0 (TID 535) 2023-04-22 21:19:24.694 Executor: INFO: Finished task 3.0 in stage 87.0 (TID 535). 925 bytes result sent to driver 2023-04-22 21:19:24.694 TaskSetManager: INFO: Starting task 4.0 in stage 87.0 (TID 536) (uger-c010.broadinstitute.org, executor driver, partition 4, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:24.695 TaskSetManager: INFO: Finished task 3.0 in stage 87.0 (TID 535) in 771 ms on uger-c010.broadinstitute.org (executor driver) (4/56) 2023-04-22 21:19:24.695 Executor: INFO: Running task 4.0 in stage 87.0 (TID 536) 2023-04-22 21:19:24.956 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 4.0 in stage 87.0 (TID 536) 2023-04-22 21:19:25.452 : INFO: TaskReport: stage=87, partition=4, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:25.452 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 4.0 in stage 87.0 (TID 536) 2023-04-22 21:19:25.453 Executor: INFO: Finished task 4.0 in stage 87.0 (TID 536). 925 bytes result sent to driver 2023-04-22 21:19:25.455 TaskSetManager: INFO: Starting task 5.0 in stage 87.0 (TID 537) (uger-c010.broadinstitute.org, executor driver, partition 5, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:25.466 TaskSetManager: INFO: Finished task 4.0 in stage 87.0 (TID 536) in 772 ms on uger-c010.broadinstitute.org (executor driver) (5/56) 2023-04-22 21:19:25.479 Executor: INFO: Running task 5.0 in stage 87.0 (TID 537) 2023-04-22 21:19:25.779 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 5.0 in stage 87.0 (TID 537) 2023-04-22 21:19:26.257 : INFO: TaskReport: stage=87, partition=5, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:26.257 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 5.0 in stage 87.0 (TID 537) 2023-04-22 21:19:26.258 Executor: INFO: Finished task 5.0 in stage 87.0 (TID 537). 
925 bytes result sent to driver 2023-04-22 21:19:26.258 TaskSetManager: INFO: Starting task 6.0 in stage 87.0 (TID 538) (uger-c010.broadinstitute.org, executor driver, partition 6, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:26.259 TaskSetManager: INFO: Finished task 5.0 in stage 87.0 (TID 537) in 803 ms on uger-c010.broadinstitute.org (executor driver) (6/56) 2023-04-22 21:19:26.268 Executor: INFO: Running task 6.0 in stage 87.0 (TID 538) 2023-04-22 21:19:26.673 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 6.0 in stage 87.0 (TID 538) 2023-04-22 21:19:27.158 : INFO: TaskReport: stage=87, partition=6, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:27.158 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 6.0 in stage 87.0 (TID 538) 2023-04-22 21:19:27.158 Executor: INFO: Finished task 6.0 in stage 87.0 (TID 538). 925 bytes result sent to driver 2023-04-22 21:19:27.159 TaskSetManager: INFO: Starting task 7.0 in stage 87.0 (TID 539) (uger-c010.broadinstitute.org, executor driver, partition 7, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:27.159 TaskSetManager: INFO: Finished task 6.0 in stage 87.0 (TID 538) in 901 ms on uger-c010.broadinstitute.org (executor driver) (7/56) 2023-04-22 21:19:27.160 Executor: INFO: Running task 7.0 in stage 87.0 (TID 539) 2023-04-22 21:19:27.423 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 7.0 in stage 87.0 (TID 539) 2023-04-22 21:19:27.899 : INFO: TaskReport: stage=87, partition=7, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:27.899 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 7.0 in stage 87.0 (TID 539) 2023-04-22 21:19:27.899 Executor: INFO: Finished task 7.0 in stage 87.0 (TID 539). 925 bytes result sent to driver 2023-04-22 21:19:27.900 TaskSetManager: INFO: Starting task 8.0 in stage 87.0 (TID 540) (uger-c010.broadinstitute.org, executor driver, partition 8, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:27.900 TaskSetManager: INFO: Finished task 7.0 in stage 87.0 (TID 539) in 742 ms on uger-c010.broadinstitute.org (executor driver) (8/56) 2023-04-22 21:19:27.901 Executor: INFO: Running task 8.0 in stage 87.0 (TID 540) 2023-04-22 21:19:28.161 : INFO: RegionPool: initialized for thread 839: Executor task launch worker for task 8.0 in stage 87.0 (TID 540) 2023-04-22 21:19:28.668 : INFO: TaskReport: stage=87, partition=8, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:28.668 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 839: Executor task launch worker for task 8.0 in stage 87.0 (TID 540) 2023-04-22 21:19:28.669 Executor: INFO: Finished task 8.0 in stage 87.0 (TID 540). 
925 bytes result sent to driver 2023-04-22 21:19:28.669 TaskSetManager: INFO: Starting task 9.0 in stage 87.0 (TID 541) (uger-c010.broadinstitute.org, executor driver, partition 9, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:28.669 TaskSetManager: INFO: Finished task 8.0 in stage 87.0 (TID 540) in 769 ms on uger-c010.broadinstitute.org (executor driver) (9/56) 2023-04-22 21:19:28.670 Executor: INFO: Running task 9.0 in stage 87.0 (TID 541) 2023-04-22 21:19:28.932 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 9.0 in stage 87.0 (TID 541) 2023-04-22 21:19:29.393 : INFO: TaskReport: stage=87, partition=9, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:29.393 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 9.0 in stage 87.0 (TID 541) 2023-04-22 21:19:29.395 Executor: INFO: Finished task 9.0 in stage 87.0 (TID 541). 968 bytes result sent to driver 2023-04-22 21:19:29.396 TaskSetManager: INFO: Starting task 10.0 in stage 87.0 (TID 542) (uger-c010.broadinstitute.org, executor driver, partition 10, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:29.396 TaskSetManager: INFO: Finished task 9.0 in stage 87.0 (TID 541) in 727 ms on uger-c010.broadinstitute.org (executor driver) (10/56) 2023-04-22 21:19:29.396 Executor: INFO: Running task 10.0 in stage 87.0 (TID 542) 2023-04-22 21:19:29.658 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 10.0 in stage 87.0 (TID 542) 2023-04-22 21:19:30.200 : INFO: TaskReport: stage=87, partition=10, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:30.200 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 10.0 in stage 87.0 (TID 542) 2023-04-22 21:19:30.201 Executor: INFO: Finished task 10.0 in stage 87.0 (TID 542). 926 bytes result sent to driver 2023-04-22 21:19:30.201 TaskSetManager: INFO: Starting task 11.0 in stage 87.0 (TID 543) (uger-c010.broadinstitute.org, executor driver, partition 11, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:30.201 TaskSetManager: INFO: Finished task 10.0 in stage 87.0 (TID 542) in 806 ms on uger-c010.broadinstitute.org (executor driver) (11/56) 2023-04-22 21:19:30.202 Executor: INFO: Running task 11.0 in stage 87.0 (TID 543) 2023-04-22 21:19:30.463 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 11.0 in stage 87.0 (TID 543) 2023-04-22 21:19:30.951 : INFO: TaskReport: stage=87, partition=11, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:30.951 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 11.0 in stage 87.0 (TID 543) 2023-04-22 21:19:30.951 Executor: INFO: Finished task 11.0 in stage 87.0 (TID 543). 
926 bytes result sent to driver 2023-04-22 21:19:30.951 TaskSetManager: INFO: Starting task 12.0 in stage 87.0 (TID 544) (uger-c010.broadinstitute.org, executor driver, partition 12, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:30.952 TaskSetManager: INFO: Finished task 11.0 in stage 87.0 (TID 543) in 751 ms on uger-c010.broadinstitute.org (executor driver) (12/56) 2023-04-22 21:19:30.952 Executor: INFO: Running task 12.0 in stage 87.0 (TID 544) 2023-04-22 21:19:31.212 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 12.0 in stage 87.0 (TID 544) 2023-04-22 21:19:31.707 : INFO: TaskReport: stage=87, partition=12, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:31.707 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 12.0 in stage 87.0 (TID 544) 2023-04-22 21:19:31.707 Executor: INFO: Finished task 12.0 in stage 87.0 (TID 544). 926 bytes result sent to driver 2023-04-22 21:19:31.708 TaskSetManager: INFO: Starting task 13.0 in stage 87.0 (TID 545) (uger-c010.broadinstitute.org, executor driver, partition 13, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:31.708 TaskSetManager: INFO: Finished task 12.0 in stage 87.0 (TID 544) in 757 ms on uger-c010.broadinstitute.org (executor driver) (13/56) 2023-04-22 21:19:31.708 Executor: INFO: Running task 13.0 in stage 87.0 (TID 545) 2023-04-22 21:19:31.968 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 13.0 in stage 87.0 (TID 545) 2023-04-22 21:19:32.524 : INFO: TaskReport: stage=87, partition=13, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:32.524 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 13.0 in stage 87.0 (TID 545) 2023-04-22 21:19:32.524 Executor: INFO: Finished task 13.0 in stage 87.0 (TID 545). 926 bytes result sent to driver 2023-04-22 21:19:32.524 TaskSetManager: INFO: Starting task 14.0 in stage 87.0 (TID 546) (uger-c010.broadinstitute.org, executor driver, partition 14, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:32.525 TaskSetManager: INFO: Finished task 13.0 in stage 87.0 (TID 545) in 818 ms on uger-c010.broadinstitute.org (executor driver) (14/56) 2023-04-22 21:19:32.525 Executor: INFO: Running task 14.0 in stage 87.0 (TID 546) 2023-04-22 21:19:32.787 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 14.0 in stage 87.0 (TID 546) 2023-04-22 21:19:33.261 : INFO: TaskReport: stage=87, partition=14, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:33.262 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 14.0 in stage 87.0 (TID 546) 2023-04-22 21:19:33.262 Executor: INFO: Finished task 14.0 in stage 87.0 (TID 546). 
926 bytes result sent to driver 2023-04-22 21:19:33.262 TaskSetManager: INFO: Starting task 15.0 in stage 87.0 (TID 547) (uger-c010.broadinstitute.org, executor driver, partition 15, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:33.263 TaskSetManager: INFO: Finished task 14.0 in stage 87.0 (TID 546) in 739 ms on uger-c010.broadinstitute.org (executor driver) (15/56) 2023-04-22 21:19:33.263 Executor: INFO: Running task 15.0 in stage 87.0 (TID 547) 2023-04-22 21:19:33.748 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 15.0 in stage 87.0 (TID 547) 2023-04-22 21:19:34.233 : INFO: TaskReport: stage=87, partition=15, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:34.233 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 15.0 in stage 87.0 (TID 547) 2023-04-22 21:19:34.234 Executor: INFO: Finished task 15.0 in stage 87.0 (TID 547). 969 bytes result sent to driver 2023-04-22 21:19:34.234 TaskSetManager: INFO: Starting task 16.0 in stage 87.0 (TID 548) (uger-c010.broadinstitute.org, executor driver, partition 16, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:34.235 TaskSetManager: INFO: Finished task 15.0 in stage 87.0 (TID 547) in 973 ms on uger-c010.broadinstitute.org (executor driver) (16/56) 2023-04-22 21:19:34.235 Executor: INFO: Running task 16.0 in stage 87.0 (TID 548) 2023-04-22 21:19:34.495 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 16.0 in stage 87.0 (TID 548) 2023-04-22 21:19:35.026 : INFO: TaskReport: stage=87, partition=16, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:35.026 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 16.0 in stage 87.0 (TID 548) 2023-04-22 21:19:35.027 Executor: INFO: Finished task 16.0 in stage 87.0 (TID 548). 926 bytes result sent to driver 2023-04-22 21:19:35.027 TaskSetManager: INFO: Starting task 17.0 in stage 87.0 (TID 549) (uger-c010.broadinstitute.org, executor driver, partition 17, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:35.027 TaskSetManager: INFO: Finished task 16.0 in stage 87.0 (TID 548) in 793 ms on uger-c010.broadinstitute.org (executor driver) (17/56) 2023-04-22 21:19:35.028 Executor: INFO: Running task 17.0 in stage 87.0 (TID 549) 2023-04-22 21:19:35.288 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 17.0 in stage 87.0 (TID 549) 2023-04-22 21:19:35.791 : INFO: TaskReport: stage=87, partition=17, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:35.791 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 17.0 in stage 87.0 (TID 549) 2023-04-22 21:19:35.792 Executor: INFO: Finished task 17.0 in stage 87.0 (TID 549). 
926 bytes result sent to driver 2023-04-22 21:19:35.792 TaskSetManager: INFO: Starting task 18.0 in stage 87.0 (TID 550) (uger-c010.broadinstitute.org, executor driver, partition 18, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:35.793 TaskSetManager: INFO: Finished task 17.0 in stage 87.0 (TID 549) in 766 ms on uger-c010.broadinstitute.org (executor driver) (18/56) 2023-04-22 21:19:35.793 Executor: INFO: Running task 18.0 in stage 87.0 (TID 550) 2023-04-22 21:19:36.053 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 18.0 in stage 87.0 (TID 550) 2023-04-22 21:19:36.542 : INFO: TaskReport: stage=87, partition=18, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:36.542 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 18.0 in stage 87.0 (TID 550) 2023-04-22 21:19:36.543 Executor: INFO: Finished task 18.0 in stage 87.0 (TID 550). 926 bytes result sent to driver 2023-04-22 21:19:36.543 TaskSetManager: INFO: Starting task 19.0 in stage 87.0 (TID 551) (uger-c010.broadinstitute.org, executor driver, partition 19, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:36.544 TaskSetManager: INFO: Finished task 18.0 in stage 87.0 (TID 550) in 752 ms on uger-c010.broadinstitute.org (executor driver) (19/56) 2023-04-22 21:19:36.544 Executor: INFO: Running task 19.0 in stage 87.0 (TID 551) 2023-04-22 21:19:36.804 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 19.0 in stage 87.0 (TID 551) 2023-04-22 21:19:37.303 : INFO: TaskReport: stage=87, partition=19, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:37.303 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 19.0 in stage 87.0 (TID 551) 2023-04-22 21:19:37.304 Executor: INFO: Finished task 19.0 in stage 87.0 (TID 551). 926 bytes result sent to driver 2023-04-22 21:19:37.304 TaskSetManager: INFO: Starting task 20.0 in stage 87.0 (TID 552) (uger-c010.broadinstitute.org, executor driver, partition 20, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:37.305 TaskSetManager: INFO: Finished task 19.0 in stage 87.0 (TID 551) in 761 ms on uger-c010.broadinstitute.org (executor driver) (20/56) 2023-04-22 21:19:37.307 Executor: INFO: Running task 20.0 in stage 87.0 (TID 552) 2023-04-22 21:19:37.567 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 20.0 in stage 87.0 (TID 552) 2023-04-22 21:19:38.046 : INFO: TaskReport: stage=87, partition=20, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:38.047 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 20.0 in stage 87.0 (TID 552) 2023-04-22 21:19:38.047 Executor: INFO: Finished task 20.0 in stage 87.0 (TID 552). 
926 bytes result sent to driver 2023-04-22 21:19:38.047 TaskSetManager: INFO: Starting task 21.0 in stage 87.0 (TID 553) (uger-c010.broadinstitute.org, executor driver, partition 21, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:38.048 TaskSetManager: INFO: Finished task 20.0 in stage 87.0 (TID 552) in 744 ms on uger-c010.broadinstitute.org (executor driver) (21/56) 2023-04-22 21:19:38.048 Executor: INFO: Running task 21.0 in stage 87.0 (TID 553) 2023-04-22 21:19:38.308 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 21.0 in stage 87.0 (TID 553) 2023-04-22 21:19:38.790 : INFO: TaskReport: stage=87, partition=21, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:38.790 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 21.0 in stage 87.0 (TID 553) 2023-04-22 21:19:38.791 Executor: INFO: Finished task 21.0 in stage 87.0 (TID 553). 926 bytes result sent to driver 2023-04-22 21:19:38.791 TaskSetManager: INFO: Starting task 22.0 in stage 87.0 (TID 554) (uger-c010.broadinstitute.org, executor driver, partition 22, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:38.791 TaskSetManager: INFO: Finished task 21.0 in stage 87.0 (TID 553) in 744 ms on uger-c010.broadinstitute.org (executor driver) (22/56) 2023-04-22 21:19:38.792 Executor: INFO: Running task 22.0 in stage 87.0 (TID 554) 2023-04-22 21:19:39.056 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 22.0 in stage 87.0 (TID 554) 2023-04-22 21:19:39.556 : INFO: TaskReport: stage=87, partition=22, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:39.556 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 22.0 in stage 87.0 (TID 554) 2023-04-22 21:19:39.557 Executor: INFO: Finished task 22.0 in stage 87.0 (TID 554). 926 bytes result sent to driver 2023-04-22 21:19:39.557 TaskSetManager: INFO: Starting task 23.0 in stage 87.0 (TID 555) (uger-c010.broadinstitute.org, executor driver, partition 23, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:39.557 TaskSetManager: INFO: Finished task 22.0 in stage 87.0 (TID 554) in 766 ms on uger-c010.broadinstitute.org (executor driver) (23/56) 2023-04-22 21:19:39.558 Executor: INFO: Running task 23.0 in stage 87.0 (TID 555) 2023-04-22 21:19:39.819 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 23.0 in stage 87.0 (TID 555) 2023-04-22 21:19:40.292 : INFO: TaskReport: stage=87, partition=23, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:40.292 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 23.0 in stage 87.0 (TID 555) 2023-04-22 21:19:40.293 Executor: INFO: Finished task 23.0 in stage 87.0 (TID 555). 
926 bytes result sent to driver 2023-04-22 21:19:40.293 TaskSetManager: INFO: Starting task 24.0 in stage 87.0 (TID 556) (uger-c010.broadinstitute.org, executor driver, partition 24, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:40.293 TaskSetManager: INFO: Finished task 23.0 in stage 87.0 (TID 555) in 736 ms on uger-c010.broadinstitute.org (executor driver) (24/56) 2023-04-22 21:19:40.295 Executor: INFO: Running task 24.0 in stage 87.0 (TID 556) 2023-04-22 21:19:40.556 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 24.0 in stage 87.0 (TID 556) 2023-04-22 21:19:41.160 : INFO: TaskReport: stage=87, partition=24, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:41.161 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 24.0 in stage 87.0 (TID 556) 2023-04-22 21:19:41.161 Executor: INFO: Finished task 24.0 in stage 87.0 (TID 556). 926 bytes result sent to driver 2023-04-22 21:19:41.161 TaskSetManager: INFO: Starting task 25.0 in stage 87.0 (TID 557) (uger-c010.broadinstitute.org, executor driver, partition 25, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:41.162 TaskSetManager: INFO: Finished task 24.0 in stage 87.0 (TID 556) in 869 ms on uger-c010.broadinstitute.org (executor driver) (25/56) 2023-04-22 21:19:41.162 Executor: INFO: Running task 25.0 in stage 87.0 (TID 557) 2023-04-22 21:19:41.600 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 25.0 in stage 87.0 (TID 557) 2023-04-22 21:19:42.088 : INFO: TaskReport: stage=87, partition=25, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:42.088 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 25.0 in stage 87.0 (TID 557) 2023-04-22 21:19:42.088 Executor: INFO: Finished task 25.0 in stage 87.0 (TID 557). 926 bytes result sent to driver 2023-04-22 21:19:42.089 TaskSetManager: INFO: Starting task 26.0 in stage 87.0 (TID 558) (uger-c010.broadinstitute.org, executor driver, partition 26, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:42.089 TaskSetManager: INFO: Finished task 25.0 in stage 87.0 (TID 557) in 928 ms on uger-c010.broadinstitute.org (executor driver) (26/56) 2023-04-22 21:19:42.090 Executor: INFO: Running task 26.0 in stage 87.0 (TID 558) 2023-04-22 21:19:42.353 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 26.0 in stage 87.0 (TID 558) 2023-04-22 21:19:42.882 : INFO: TaskReport: stage=87, partition=26, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:42.882 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 26.0 in stage 87.0 (TID 558) 2023-04-22 21:19:42.883 Executor: INFO: Finished task 26.0 in stage 87.0 (TID 558). 
926 bytes result sent to driver 2023-04-22 21:19:42.883 TaskSetManager: INFO: Starting task 27.0 in stage 87.0 (TID 559) (uger-c010.broadinstitute.org, executor driver, partition 27, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:42.883 TaskSetManager: INFO: Finished task 26.0 in stage 87.0 (TID 558) in 794 ms on uger-c010.broadinstitute.org (executor driver) (27/56) 2023-04-22 21:19:42.886 Executor: INFO: Running task 27.0 in stage 87.0 (TID 559) 2023-04-22 21:19:43.139 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 27.0 in stage 87.0 (TID 559) 2023-04-22 21:19:43.662 : INFO: TaskReport: stage=87, partition=27, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.663 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 27.0 in stage 87.0 (TID 559) 2023-04-22 21:19:43.663 Executor: INFO: Finished task 27.0 in stage 87.0 (TID 559). 925 bytes result sent to driver 2023-04-22 21:19:43.663 TaskSetManager: INFO: Starting task 28.0 in stage 87.0 (TID 560) (uger-c010.broadinstitute.org, executor driver, partition 28, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.664 TaskSetManager: INFO: Finished task 27.0 in stage 87.0 (TID 559) in 781 ms on uger-c010.broadinstitute.org (executor driver) (28/56) 2023-04-22 21:19:43.664 Executor: INFO: Running task 28.0 in stage 87.0 (TID 560) 2023-04-22 21:19:43.677 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 28.0 in stage 87.0 (TID 560) 2023-04-22 21:19:43.699 : INFO: TaskReport: stage=87, partition=28, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.699 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 28.0 in stage 87.0 (TID 560) 2023-04-22 21:19:43.699 Executor: INFO: Finished task 28.0 in stage 87.0 (TID 560). 925 bytes result sent to driver 2023-04-22 21:19:43.699 TaskSetManager: INFO: Starting task 29.0 in stage 87.0 (TID 561) (uger-c010.broadinstitute.org, executor driver, partition 29, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.700 TaskSetManager: INFO: Finished task 28.0 in stage 87.0 (TID 560) in 37 ms on uger-c010.broadinstitute.org (executor driver) (29/56) 2023-04-22 21:19:43.700 Executor: INFO: Running task 29.0 in stage 87.0 (TID 561) 2023-04-22 21:19:43.712 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 29.0 in stage 87.0 (TID 561) 2023-04-22 21:19:43.732 : INFO: TaskReport: stage=87, partition=29, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.732 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 29.0 in stage 87.0 (TID 561) 2023-04-22 21:19:43.732 Executor: INFO: Finished task 29.0 in stage 87.0 (TID 561). 
925 bytes result sent to driver 2023-04-22 21:19:43.732 TaskSetManager: INFO: Starting task 30.0 in stage 87.0 (TID 562) (uger-c010.broadinstitute.org, executor driver, partition 30, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.732 TaskSetManager: INFO: Finished task 29.0 in stage 87.0 (TID 561) in 33 ms on uger-c010.broadinstitute.org (executor driver) (30/56) 2023-04-22 21:19:43.733 Executor: INFO: Running task 30.0 in stage 87.0 (TID 562) 2023-04-22 21:19:43.744 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 30.0 in stage 87.0 (TID 562) 2023-04-22 21:19:43.763 : INFO: TaskReport: stage=87, partition=30, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.763 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 30.0 in stage 87.0 (TID 562) 2023-04-22 21:19:43.763 Executor: INFO: Finished task 30.0 in stage 87.0 (TID 562). 925 bytes result sent to driver 2023-04-22 21:19:43.764 TaskSetManager: INFO: Starting task 31.0 in stage 87.0 (TID 563) (uger-c010.broadinstitute.org, executor driver, partition 31, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.764 TaskSetManager: INFO: Finished task 30.0 in stage 87.0 (TID 562) in 32 ms on uger-c010.broadinstitute.org (executor driver) (31/56) 2023-04-22 21:19:43.764 Executor: INFO: Running task 31.0 in stage 87.0 (TID 563) 2023-04-22 21:19:43.775 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 31.0 in stage 87.0 (TID 563) 2023-04-22 21:19:43.797 : INFO: TaskReport: stage=87, partition=31, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.797 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 31.0 in stage 87.0 (TID 563) 2023-04-22 21:19:43.797 Executor: INFO: Finished task 31.0 in stage 87.0 (TID 563). 925 bytes result sent to driver 2023-04-22 21:19:43.797 TaskSetManager: INFO: Starting task 32.0 in stage 87.0 (TID 564) (uger-c010.broadinstitute.org, executor driver, partition 32, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.798 TaskSetManager: INFO: Finished task 31.0 in stage 87.0 (TID 563) in 35 ms on uger-c010.broadinstitute.org (executor driver) (32/56) 2023-04-22 21:19:43.798 Executor: INFO: Running task 32.0 in stage 87.0 (TID 564) 2023-04-22 21:19:43.809 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 32.0 in stage 87.0 (TID 564) 2023-04-22 21:19:43.830 : INFO: TaskReport: stage=87, partition=32, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.830 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 32.0 in stage 87.0 (TID 564) 2023-04-22 21:19:43.830 Executor: INFO: Finished task 32.0 in stage 87.0 (TID 564). 
925 bytes result sent to driver 2023-04-22 21:19:43.830 TaskSetManager: INFO: Starting task 33.0 in stage 87.0 (TID 565) (uger-c010.broadinstitute.org, executor driver, partition 33, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.830 TaskSetManager: INFO: Finished task 32.0 in stage 87.0 (TID 564) in 33 ms on uger-c010.broadinstitute.org (executor driver) (33/56) 2023-04-22 21:19:43.831 Executor: INFO: Running task 33.0 in stage 87.0 (TID 565) 2023-04-22 21:19:43.842 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 33.0 in stage 87.0 (TID 565) 2023-04-22 21:19:43.863 : INFO: TaskReport: stage=87, partition=33, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.863 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 33.0 in stage 87.0 (TID 565) 2023-04-22 21:19:43.863 Executor: INFO: Finished task 33.0 in stage 87.0 (TID 565). 925 bytes result sent to driver 2023-04-22 21:19:43.863 TaskSetManager: INFO: Starting task 34.0 in stage 87.0 (TID 566) (uger-c010.broadinstitute.org, executor driver, partition 34, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.863 TaskSetManager: INFO: Finished task 33.0 in stage 87.0 (TID 565) in 33 ms on uger-c010.broadinstitute.org (executor driver) (34/56) 2023-04-22 21:19:43.864 Executor: INFO: Running task 34.0 in stage 87.0 (TID 566) 2023-04-22 21:19:43.876 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 34.0 in stage 87.0 (TID 566) 2023-04-22 21:19:43.897 : INFO: TaskReport: stage=87, partition=34, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.897 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 34.0 in stage 87.0 (TID 566) 2023-04-22 21:19:43.897 Executor: INFO: Finished task 34.0 in stage 87.0 (TID 566). 925 bytes result sent to driver 2023-04-22 21:19:43.897 TaskSetManager: INFO: Starting task 35.0 in stage 87.0 (TID 567) (uger-c010.broadinstitute.org, executor driver, partition 35, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.897 TaskSetManager: INFO: Finished task 34.0 in stage 87.0 (TID 566) in 34 ms on uger-c010.broadinstitute.org (executor driver) (35/56) 2023-04-22 21:19:43.898 Executor: INFO: Running task 35.0 in stage 87.0 (TID 567) 2023-04-22 21:19:43.909 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 35.0 in stage 87.0 (TID 567) 2023-04-22 21:19:43.929 : INFO: TaskReport: stage=87, partition=35, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.929 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 35.0 in stage 87.0 (TID 567) 2023-04-22 21:19:43.929 Executor: INFO: Finished task 35.0 in stage 87.0 (TID 567). 
925 bytes result sent to driver 2023-04-22 21:19:43.929 TaskSetManager: INFO: Starting task 36.0 in stage 87.0 (TID 568) (uger-c010.broadinstitute.org, executor driver, partition 36, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.929 TaskSetManager: INFO: Finished task 35.0 in stage 87.0 (TID 567) in 32 ms on uger-c010.broadinstitute.org (executor driver) (36/56) 2023-04-22 21:19:43.930 Executor: INFO: Running task 36.0 in stage 87.0 (TID 568) 2023-04-22 21:19:43.942 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 36.0 in stage 87.0 (TID 568) 2023-04-22 21:19:43.963 : INFO: TaskReport: stage=87, partition=36, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.963 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 36.0 in stage 87.0 (TID 568) 2023-04-22 21:19:43.963 Executor: INFO: Finished task 36.0 in stage 87.0 (TID 568). 925 bytes result sent to driver 2023-04-22 21:19:43.963 TaskSetManager: INFO: Starting task 37.0 in stage 87.0 (TID 569) (uger-c010.broadinstitute.org, executor driver, partition 37, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.964 TaskSetManager: INFO: Finished task 36.0 in stage 87.0 (TID 568) in 35 ms on uger-c010.broadinstitute.org (executor driver) (37/56) 2023-04-22 21:19:43.964 Executor: INFO: Running task 37.0 in stage 87.0 (TID 569) 2023-04-22 21:19:43.975 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 37.0 in stage 87.0 (TID 569) 2023-04-22 21:19:43.997 : INFO: TaskReport: stage=87, partition=37, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:43.997 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 37.0 in stage 87.0 (TID 569) 2023-04-22 21:19:43.997 Executor: INFO: Finished task 37.0 in stage 87.0 (TID 569). 925 bytes result sent to driver 2023-04-22 21:19:43.998 TaskSetManager: INFO: Starting task 38.0 in stage 87.0 (TID 570) (uger-c010.broadinstitute.org, executor driver, partition 38, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:43.998 TaskSetManager: INFO: Finished task 37.0 in stage 87.0 (TID 569) in 35 ms on uger-c010.broadinstitute.org (executor driver) (38/56) 2023-04-22 21:19:43.998 Executor: INFO: Running task 38.0 in stage 87.0 (TID 570) 2023-04-22 21:19:44.013 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 38.0 in stage 87.0 (TID 570) 2023-04-22 21:19:44.033 : INFO: TaskReport: stage=87, partition=38, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.033 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 38.0 in stage 87.0 (TID 570) 2023-04-22 21:19:44.033 Executor: INFO: Finished task 38.0 in stage 87.0 (TID 570). 
925 bytes result sent to driver 2023-04-22 21:19:44.033 TaskSetManager: INFO: Starting task 39.0 in stage 87.0 (TID 571) (uger-c010.broadinstitute.org, executor driver, partition 39, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.033 TaskSetManager: INFO: Finished task 38.0 in stage 87.0 (TID 570) in 35 ms on uger-c010.broadinstitute.org (executor driver) (39/56) 2023-04-22 21:19:44.034 Executor: INFO: Running task 39.0 in stage 87.0 (TID 571) 2023-04-22 21:19:44.045 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 39.0 in stage 87.0 (TID 571) 2023-04-22 21:19:44.064 : INFO: TaskReport: stage=87, partition=39, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.064 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 39.0 in stage 87.0 (TID 571) 2023-04-22 21:19:44.064 Executor: INFO: Finished task 39.0 in stage 87.0 (TID 571). 925 bytes result sent to driver 2023-04-22 21:19:44.064 TaskSetManager: INFO: Starting task 40.0 in stage 87.0 (TID 572) (uger-c010.broadinstitute.org, executor driver, partition 40, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.065 TaskSetManager: INFO: Finished task 39.0 in stage 87.0 (TID 571) in 32 ms on uger-c010.broadinstitute.org (executor driver) (40/56) 2023-04-22 21:19:44.065 Executor: INFO: Running task 40.0 in stage 87.0 (TID 572) 2023-04-22 21:19:44.076 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 40.0 in stage 87.0 (TID 572) 2023-04-22 21:19:44.096 : INFO: TaskReport: stage=87, partition=40, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.096 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 40.0 in stage 87.0 (TID 572) 2023-04-22 21:19:44.096 Executor: INFO: Finished task 40.0 in stage 87.0 (TID 572). 925 bytes result sent to driver 2023-04-22 21:19:44.097 TaskSetManager: INFO: Starting task 41.0 in stage 87.0 (TID 573) (uger-c010.broadinstitute.org, executor driver, partition 41, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.097 TaskSetManager: INFO: Finished task 40.0 in stage 87.0 (TID 572) in 33 ms on uger-c010.broadinstitute.org (executor driver) (41/56) 2023-04-22 21:19:44.097 Executor: INFO: Running task 41.0 in stage 87.0 (TID 573) 2023-04-22 21:19:44.108 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 41.0 in stage 87.0 (TID 573) 2023-04-22 21:19:44.128 : INFO: TaskReport: stage=87, partition=41, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.128 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 41.0 in stage 87.0 (TID 573) 2023-04-22 21:19:44.128 Executor: INFO: Finished task 41.0 in stage 87.0 (TID 573). 
925 bytes result sent to driver 2023-04-22 21:19:44.129 TaskSetManager: INFO: Starting task 42.0 in stage 87.0 (TID 574) (uger-c010.broadinstitute.org, executor driver, partition 42, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.129 TaskSetManager: INFO: Finished task 41.0 in stage 87.0 (TID 573) in 32 ms on uger-c010.broadinstitute.org (executor driver) (42/56) 2023-04-22 21:19:44.129 Executor: INFO: Running task 42.0 in stage 87.0 (TID 574) 2023-04-22 21:19:44.140 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 42.0 in stage 87.0 (TID 574) 2023-04-22 21:19:44.159 : INFO: TaskReport: stage=87, partition=42, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.159 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 42.0 in stage 87.0 (TID 574) 2023-04-22 21:19:44.159 Executor: INFO: Finished task 42.0 in stage 87.0 (TID 574). 925 bytes result sent to driver 2023-04-22 21:19:44.159 TaskSetManager: INFO: Starting task 43.0 in stage 87.0 (TID 575) (uger-c010.broadinstitute.org, executor driver, partition 43, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.160 TaskSetManager: INFO: Finished task 42.0 in stage 87.0 (TID 574) in 32 ms on uger-c010.broadinstitute.org (executor driver) (43/56) 2023-04-22 21:19:44.160 Executor: INFO: Running task 43.0 in stage 87.0 (TID 575) 2023-04-22 21:19:44.174 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 43.0 in stage 87.0 (TID 575) 2023-04-22 21:19:44.193 : INFO: TaskReport: stage=87, partition=43, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.193 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 43.0 in stage 87.0 (TID 575) 2023-04-22 21:19:44.193 Executor: INFO: Finished task 43.0 in stage 87.0 (TID 575). 925 bytes result sent to driver 2023-04-22 21:19:44.194 TaskSetManager: INFO: Starting task 44.0 in stage 87.0 (TID 576) (uger-c010.broadinstitute.org, executor driver, partition 44, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.194 TaskSetManager: INFO: Finished task 43.0 in stage 87.0 (TID 575) in 35 ms on uger-c010.broadinstitute.org (executor driver) (44/56) 2023-04-22 21:19:44.195 Executor: INFO: Running task 44.0 in stage 87.0 (TID 576) 2023-04-22 21:19:44.208 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 44.0 in stage 87.0 (TID 576) 2023-04-22 21:19:44.227 : INFO: TaskReport: stage=87, partition=44, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.227 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 44.0 in stage 87.0 (TID 576) 2023-04-22 21:19:44.228 Executor: INFO: Finished task 44.0 in stage 87.0 (TID 576). 
925 bytes result sent to driver 2023-04-22 21:19:44.228 TaskSetManager: INFO: Starting task 45.0 in stage 87.0 (TID 577) (uger-c010.broadinstitute.org, executor driver, partition 45, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.228 TaskSetManager: INFO: Finished task 44.0 in stage 87.0 (TID 576) in 34 ms on uger-c010.broadinstitute.org (executor driver) (45/56) 2023-04-22 21:19:44.228 Executor: INFO: Running task 45.0 in stage 87.0 (TID 577) 2023-04-22 21:19:44.239 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 45.0 in stage 87.0 (TID 577) 2023-04-22 21:19:44.259 : INFO: TaskReport: stage=87, partition=45, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.259 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 45.0 in stage 87.0 (TID 577) 2023-04-22 21:19:44.259 Executor: INFO: Finished task 45.0 in stage 87.0 (TID 577). 925 bytes result sent to driver 2023-04-22 21:19:44.259 TaskSetManager: INFO: Starting task 46.0 in stage 87.0 (TID 578) (uger-c010.broadinstitute.org, executor driver, partition 46, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.259 TaskSetManager: INFO: Finished task 45.0 in stage 87.0 (TID 577) in 31 ms on uger-c010.broadinstitute.org (executor driver) (46/56) 2023-04-22 21:19:44.260 Executor: INFO: Running task 46.0 in stage 87.0 (TID 578) 2023-04-22 21:19:44.271 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 46.0 in stage 87.0 (TID 578) 2023-04-22 21:19:44.292 : INFO: TaskReport: stage=87, partition=46, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.292 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 46.0 in stage 87.0 (TID 578) 2023-04-22 21:19:44.292 Executor: INFO: Finished task 46.0 in stage 87.0 (TID 578). 925 bytes result sent to driver 2023-04-22 21:19:44.292 TaskSetManager: INFO: Starting task 47.0 in stage 87.0 (TID 579) (uger-c010.broadinstitute.org, executor driver, partition 47, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.292 TaskSetManager: INFO: Finished task 46.0 in stage 87.0 (TID 578) in 33 ms on uger-c010.broadinstitute.org (executor driver) (47/56) 2023-04-22 21:19:44.293 Executor: INFO: Running task 47.0 in stage 87.0 (TID 579) 2023-04-22 21:19:44.304 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 47.0 in stage 87.0 (TID 579) 2023-04-22 21:19:44.324 : INFO: TaskReport: stage=87, partition=47, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.324 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 47.0 in stage 87.0 (TID 579) 2023-04-22 21:19:44.324 Executor: INFO: Finished task 47.0 in stage 87.0 (TID 579). 
925 bytes result sent to driver 2023-04-22 21:19:44.324 TaskSetManager: INFO: Starting task 48.0 in stage 87.0 (TID 580) (uger-c010.broadinstitute.org, executor driver, partition 48, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.324 TaskSetManager: INFO: Finished task 47.0 in stage 87.0 (TID 579) in 32 ms on uger-c010.broadinstitute.org (executor driver) (48/56) 2023-04-22 21:19:44.325 Executor: INFO: Running task 48.0 in stage 87.0 (TID 580) 2023-04-22 21:19:44.336 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 48.0 in stage 87.0 (TID 580) 2023-04-22 21:19:44.355 : INFO: TaskReport: stage=87, partition=48, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.355 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 48.0 in stage 87.0 (TID 580) 2023-04-22 21:19:44.356 Executor: INFO: Finished task 48.0 in stage 87.0 (TID 580). 925 bytes result sent to driver 2023-04-22 21:19:44.356 TaskSetManager: INFO: Starting task 49.0 in stage 87.0 (TID 581) (uger-c010.broadinstitute.org, executor driver, partition 49, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.356 TaskSetManager: INFO: Finished task 48.0 in stage 87.0 (TID 580) in 32 ms on uger-c010.broadinstitute.org (executor driver) (49/56) 2023-04-22 21:19:44.356 Executor: INFO: Running task 49.0 in stage 87.0 (TID 581) 2023-04-22 21:19:44.367 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 49.0 in stage 87.0 (TID 581) 2023-04-22 21:19:44.387 : INFO: TaskReport: stage=87, partition=49, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.387 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 49.0 in stage 87.0 (TID 581) 2023-04-22 21:19:44.387 Executor: INFO: Finished task 49.0 in stage 87.0 (TID 581). 925 bytes result sent to driver 2023-04-22 21:19:44.387 TaskSetManager: INFO: Starting task 50.0 in stage 87.0 (TID 582) (uger-c010.broadinstitute.org, executor driver, partition 50, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.388 TaskSetManager: INFO: Finished task 49.0 in stage 87.0 (TID 581) in 32 ms on uger-c010.broadinstitute.org (executor driver) (50/56) 2023-04-22 21:19:44.388 Executor: INFO: Running task 50.0 in stage 87.0 (TID 582) 2023-04-22 21:19:44.398 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 50.0 in stage 87.0 (TID 582) 2023-04-22 21:19:44.418 : INFO: TaskReport: stage=87, partition=50, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.418 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 50.0 in stage 87.0 (TID 582) 2023-04-22 21:19:44.418 Executor: INFO: Finished task 50.0 in stage 87.0 (TID 582). 
925 bytes result sent to driver 2023-04-22 21:19:44.418 TaskSetManager: INFO: Starting task 51.0 in stage 87.0 (TID 583) (uger-c010.broadinstitute.org, executor driver, partition 51, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.419 TaskSetManager: INFO: Finished task 50.0 in stage 87.0 (TID 582) in 32 ms on uger-c010.broadinstitute.org (executor driver) (51/56) 2023-04-22 21:19:44.419 Executor: INFO: Running task 51.0 in stage 87.0 (TID 583) 2023-04-22 21:19:44.429 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 51.0 in stage 87.0 (TID 583) 2023-04-22 21:19:44.453 : INFO: TaskReport: stage=87, partition=51, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.453 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 51.0 in stage 87.0 (TID 583) 2023-04-22 21:19:44.453 Executor: INFO: Finished task 51.0 in stage 87.0 (TID 583). 925 bytes result sent to driver 2023-04-22 21:19:44.453 TaskSetManager: INFO: Starting task 52.0 in stage 87.0 (TID 584) (uger-c010.broadinstitute.org, executor driver, partition 52, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.453 TaskSetManager: INFO: Finished task 51.0 in stage 87.0 (TID 583) in 35 ms on uger-c010.broadinstitute.org (executor driver) (52/56) 2023-04-22 21:19:44.462 Executor: INFO: Running task 52.0 in stage 87.0 (TID 584) 2023-04-22 21:19:44.517 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 52.0 in stage 87.0 (TID 584) 2023-04-22 21:19:44.546 : INFO: TaskReport: stage=87, partition=52, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.546 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 52.0 in stage 87.0 (TID 584) 2023-04-22 21:19:44.546 Executor: INFO: Finished task 52.0 in stage 87.0 (TID 584). 925 bytes result sent to driver 2023-04-22 21:19:44.546 TaskSetManager: INFO: Starting task 53.0 in stage 87.0 (TID 585) (uger-c010.broadinstitute.org, executor driver, partition 53, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.547 TaskSetManager: INFO: Finished task 52.0 in stage 87.0 (TID 584) in 94 ms on uger-c010.broadinstitute.org (executor driver) (53/56) 2023-04-22 21:19:44.547 Executor: INFO: Running task 53.0 in stage 87.0 (TID 585) 2023-04-22 21:19:44.558 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 53.0 in stage 87.0 (TID 585) 2023-04-22 21:19:44.579 : INFO: TaskReport: stage=87, partition=53, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.579 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 53.0 in stage 87.0 (TID 585) 2023-04-22 21:19:44.579 Executor: INFO: Finished task 53.0 in stage 87.0 (TID 585). 
925 bytes result sent to driver 2023-04-22 21:19:44.580 TaskSetManager: INFO: Starting task 54.0 in stage 87.0 (TID 586) (uger-c010.broadinstitute.org, executor driver, partition 54, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.580 TaskSetManager: INFO: Finished task 53.0 in stage 87.0 (TID 585) in 34 ms on uger-c010.broadinstitute.org (executor driver) (54/56) 2023-04-22 21:19:44.580 Executor: INFO: Running task 54.0 in stage 87.0 (TID 586) 2023-04-22 21:19:44.591 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 54.0 in stage 87.0 (TID 586) 2023-04-22 21:19:44.610 : INFO: TaskReport: stage=87, partition=54, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.610 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 54.0 in stage 87.0 (TID 586) 2023-04-22 21:19:44.610 Executor: INFO: Finished task 54.0 in stage 87.0 (TID 586). 925 bytes result sent to driver 2023-04-22 21:19:44.610 TaskSetManager: INFO: Starting task 55.0 in stage 87.0 (TID 587) (uger-c010.broadinstitute.org, executor driver, partition 55, PROCESS_LOCAL, 4362 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.610 TaskSetManager: INFO: Finished task 54.0 in stage 87.0 (TID 586) in 31 ms on uger-c010.broadinstitute.org (executor driver) (55/56) 2023-04-22 21:19:44.612 Executor: INFO: Running task 55.0 in stage 87.0 (TID 587) 2023-04-22 21:19:44.623 : INFO: RegionPool: initialized for thread 540: Executor task launch worker for task 55.0 in stage 87.0 (TID 587) 2023-04-22 21:19:44.642 : INFO: TaskReport: stage=87, partition=55, attempt=0, peakBytes=131072, peakBytesReadable=128.00 KiB, chunks requested=0, cache hits=0 2023-04-22 21:19:44.642 : INFO: RegionPool: FREE: 128.0K allocated (128.0K blocks / 0 chunks), regions.size = 2, 0 current java objects, thread 540: Executor task launch worker for task 55.0 in stage 87.0 (TID 587) 2023-04-22 21:19:44.642 Executor: INFO: Finished task 55.0 in stage 87.0 (TID 587). 925 bytes result sent to driver 2023-04-22 21:19:44.643 TaskSetManager: INFO: Finished task 55.0 in stage 87.0 (TID 587) in 33 ms on uger-c010.broadinstitute.org (executor driver) (56/56) 2023-04-22 21:19:44.643 TaskSchedulerImpl: INFO: Removed TaskSet 87.0, whose tasks have all completed, from pool 2023-04-22 21:19:44.643 DAGScheduler: INFO: ResultStage 87 (collect at ContextRDD.scala:176) finished in 23.083 s 2023-04-22 21:19:44.643 DAGScheduler: INFO: Job 48 is finished. 
Cancelling potential speculative or zombie tasks for this job 2023-04-22 21:19:44.643 TaskSchedulerImpl: INFO: Killing all running tasks in stage 87: Stage finished 2023-04-22 21:19:44.644 DAGScheduler: INFO: Job 48 finished: collect at ContextRDD.scala:176, took 23.087760 s 2023-04-22 21:19:44.669 Hail: INFO: wrote matrix with 114591 rows and 4151 columns as 56 blocks of size 4096 to /fg/saxenalab/data/GnomAD/GnomAD_HGDP-1KG/PC_Relate/HG38/pcrelate-write-read-BkWnS8a1LMZurvEXJjGAoA.bm 2023-04-22 21:19:44.847 SparkContext: INFO: Starting job: collect at ContextRDD.scala:176 2023-04-22 21:19:44.847 DAGScheduler: INFO: Got job 49 (collect at ContextRDD.scala:176) with 4 output partitions 2023-04-22 21:19:44.847 DAGScheduler: INFO: Final stage: ResultStage 88 (collect at ContextRDD.scala:176) 2023-04-22 21:19:44.847 DAGScheduler: INFO: Parents of final stage: List() 2023-04-22 21:19:44.848 DAGScheduler: INFO: Missing parents: List() 2023-04-22 21:19:44.848 DAGScheduler: INFO: Submitting ResultStage 88 (MapPartitionsRDD[214] at mapPartitions at ContextRDD.scala:168), which has no missing parents 2023-04-22 21:19:44.862 MemoryStore: INFO: Block broadcast_209 stored as values in memory (estimated size 119.2 KiB, free 25.2 GiB) 2023-04-22 21:19:44.864 MemoryStore: INFO: Block broadcast_209_piece0 stored as bytes in memory (estimated size 46.9 KiB, free 25.2 GiB) 2023-04-22 21:19:44.864 BlockManagerInfo: INFO: Added broadcast_209_piece0 in memory on uger-c010.broadinstitute.org:46121 (size: 46.9 KiB, free: 25.3 GiB) 2023-04-22 21:19:44.877 SparkContext: INFO: Created broadcast 209 from broadcast at DAGScheduler.scala:1513 2023-04-22 21:19:44.877 DAGScheduler: INFO: Submitting 4 missing tasks from ResultStage 88 (MapPartitionsRDD[214] at mapPartitions at ContextRDD.scala:168) (first 15 tasks are for partitions Vector(0, 1, 2, 3)) 2023-04-22 21:19:44.877 TaskSchedulerImpl: INFO: Adding task set 88.0 with 4 tasks resource profile 0 2023-04-22 21:19:44.878 TaskSetManager: INFO: Starting task 0.0 in stage 88.0 (TID 588) (uger-c010.broadinstitute.org, executor driver, partition 0, PROCESS_LOCAL, 4265 bytes) taskResourceAssignments Map() 2023-04-22 21:19:44.878 Executor: INFO: Running task 0.0 in stage 88.0 (TID 588) 2023-04-22 21:19:44.920 BlockManager: INFO: Found block rdd_188_0 locally 2023-04-22 21:19:44.920 BlockManager: INFO: Found block rdd_188_0 locally 2023-04-22 21:19:45.058 BlockManagerInfo: INFO: Removed broadcast_208_piece0 on uger-c010.broadinstitute.org:46121 in memory (size: 40.9 KiB, free: 25.3 GiB) 2023-04-22 21:19:45.606 MemoryStore: INFO: Block rdd_207_0 stored as values in memory (estimated size 128.0 MiB, free 25.1 GiB) 2023-04-22 21:19:45.606 BlockManagerInfo: INFO: Added rdd_207_0 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.1 GiB) 2023-04-22 21:19:45.606 BlockManager: INFO: Found block rdd_207_0 locally 2023-04-22 21:21:24.136 BlockManager: INFO: Found block rdd_188_1 locally 2023-04-22 21:21:24.136 BlockManager: INFO: Found block rdd_188_1 locally 2023-04-22 21:21:24.887 MemoryStore: INFO: Block rdd_207_1 stored as values in memory (estimated size 128.0 MiB, free 24.9 GiB) 2023-04-22 21:21:24.887 BlockManagerInfo: INFO: Added rdd_207_1 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 25.0 GiB) 2023-04-22 21:21:24.887 BlockManager: INFO: Found block rdd_207_1 locally 2023-04-22 21:24:46.899 BlockManager: INFO: Found block rdd_188_2 locally 2023-04-22 21:24:46.899 BlockManager: INFO: Found block rdd_188_2 locally 2023-04-22 
21:24:48.001 MemoryStore: INFO: Block rdd_207_2 stored as values in memory (estimated size 128.0 MiB, free 24.8 GiB) 2023-04-22 21:24:48.002 BlockManagerInfo: INFO: Added rdd_207_2 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.9 GiB) 2023-04-22 21:24:48.002 BlockManager: INFO: Found block rdd_207_2 locally 2023-04-22 21:28:10.954 HeartbeatReceiver: WARN: Removing executor driver with no recent heartbeats: 204062 ms exceeds timeout 120000 ms 2023-04-22 21:28:10.969 BlockManager: INFO: Found block rdd_188_3 locally 2023-04-22 21:28:10.969 BlockManager: INFO: Found block rdd_188_3 locally 2023-04-22 21:28:10.970 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:10.971 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:10.971 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:10.973 SparkContext: WARN: Killing executors is not supported by current scheduler. 2023-04-22 21:28:10.982 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) 
~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] 
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:10.983 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:10.999 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:10.999 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:10.999 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.006 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.006 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.008 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.008 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.008 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.020 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.020 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.022 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.022 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.022 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.025 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.025 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.027 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.027 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.027 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.029 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.030 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.031 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.031 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.031 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.032 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.032 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.046 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.046 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.046 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.049 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.057 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.058 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.058 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.058 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.058 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.059 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.060 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:28:11.060 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:28:11.060 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:28:11.060 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.060 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.061 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.061 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.061 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.062 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.062 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.063 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.063 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.063 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.068 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.068 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.069 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:28:11.069 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:28:11.069 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:28:11.070 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.070 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.071 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:28:11.071 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:28:11.071 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:28:11.071 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.071 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.072 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:28:11.073 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:28:11.073 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:28:11.073 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.073 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.091 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.091 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.091 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.102 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.102 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.103 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.103 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.103 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.104 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.104 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.105 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:28:11.105 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:28:11.105 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:28:11.105 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:28:11.110 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:28:11.111 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:11.111 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:11.111 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:11.113 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:28:11.114 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:28:11.729 MemoryStore: INFO: Block rdd_207_3 stored as values in memory (estimated size 128.0 MiB, free 24.7 GiB) 2023-04-22 21:28:11.729 BlockManagerInfo: INFO: Added rdd_207_3 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.8 GiB) 2023-04-22 21:28:11.730 BlockManager: INFO: Found block rdd_207_3 locally 2023-04-22 21:28:14.110 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:28:14.110 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:28:14.110 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:28:14.111 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:31:29.494 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.508 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.508 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.508 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.509 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.509 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:31:29.514 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.514 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.514 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.514 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.515 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.520 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.520 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.520 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.520 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.521 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.521 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.521 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.521 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.522 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.522 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.523 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.523 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.523 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.523 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.524 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:31:29.528 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.528 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.528 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.529 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.529 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.530 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.530 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.530 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.530 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:31:29.534 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
	at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
	... 3 more
2023-04-22 21:31:29.545 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:31:29.545 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:31:29.545 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:31:29.546 Inbox: ERROR: Ignoring error
org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543
2023-04-22 21:31:29.546 Executor: WARN: Issue communicating with driver in heartbeater
org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543
2023-04-22 21:31:29.547 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:31:29.547 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:31:29.547 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:31:29.547 Inbox: ERROR: Ignoring error
org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543
2023-04-22 21:31:29.548 Executor: WARN: Issue communicating with driver in heartbeater
org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.548 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.548 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.548 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.549 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.549 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.561 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.561 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.561 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.561 BlockManager: INFO: Found block rdd_188_4 locally 2023-04-22 21:31:29.561 BlockManager: INFO: Found block rdd_188_4 locally 2023-04-22 21:31:29.587 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) 
~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.606 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.607 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.607 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.607 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.607 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.608 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.608 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.608 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.608 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.609 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.609 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.610 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.610 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.610 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.610 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.611 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.611 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:29.611 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:29.611 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:29.612 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:29.612 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:31:30.417 MemoryStore: INFO: Block rdd_207_4 stored as values in memory (estimated size 128.0 MiB, free 24.6 GiB) 2023-04-22 21:31:30.417 BlockManagerInfo: INFO: Added rdd_207_4 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.6 GiB) 2023-04-22 21:31:30.418 BlockManager: INFO: Found block rdd_207_4 locally 2023-04-22 21:31:34.110 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:31:34.110 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:31:34.110 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:31:34.111 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at 
org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.924 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.928 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.928 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.928 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.928 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.929 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.930 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.930 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.930 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.930 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:34:48.933 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: [stack trace identical to the 21:34:48.929 heartbeater warning above]
2023-04-22 21:34:48.935 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:34:48.935 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:34:48.935 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:34:48.935 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: [stack trace identical to the 21:34:48.930 "Ignoring error" entry above]
2023-04-22 21:34:48.936 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: [stack trace identical to the 21:34:48.929 heartbeater warning above]
2023-04-22 21:34:48.936 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:34:48.936 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:34:48.936 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:34:48.937 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: [stack trace identical to the 21:34:48.930 "Ignoring error" entry above]
2023-04-22 21:34:48.937 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:34:48.947 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:34:48.947 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:34:48.947 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:34:48.947 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:34:48.948 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:34:48.948 Executor: INFO: Told to re-register on heartbeat
2023-04-22 21:34:48.948 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master
2023-04-22 21:34:48.948 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None)
2023-04-22 21:34:48.949 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:34:48.949 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult:
2023-04-22 21:34:48.950 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.950 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.950 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.950 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.951 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:34:48.954 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.954 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.954 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.954 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.954 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:34:48.957 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.957 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.957 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.957 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.957 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.958 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more
2023-04-22 21:34:48.961 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.962 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.962 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.962 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.962 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.974 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.975 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.975 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.975 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.975 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.976 BlockManager: INFO: Found block rdd_188_5 locally 2023-04-22 21:34:48.976 BlockManager: INFO: Found block rdd_188_5 locally 2023-04-22 21:34:48.976 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.977 Executor: INFO: Told to re-register on heartbeat 2023-04-22 21:34:48.977 BlockManager: INFO: BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) re-registering with master 2023-04-22 21:34:48.977 BlockManagerMaster: INFO: Registering BlockManager BlockManagerId(driver, uger-c010.broadinstitute.org, 46121, None) 2023-04-22 21:34:48.977 Inbox: ERROR: Ignoring error org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] 
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] 
at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] ... 3 more 2023-04-22 21:34:48.978 Executor: WARN: Issue communicating with driver in heartbeater org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEndpointRef.askSync(RpcEndpointRef.scala:87) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:80) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManager.reregister(BlockManager.scala:643) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.reportHeartBeat(Executor.scala:1057) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.executor.Executor.$anonfun$heartbeater$1(Executor.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] 
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.Heartbeater$$anon$1.run(Heartbeater.scala:46) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180) ~[?:1.8.0_345] at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345] at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345] Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult: at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:301) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:102) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:110) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:36) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.driverEndpoint$lzycompute(BlockManagerMasterEndpoint.scala:117) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$driverEndpoint(BlockManagerMasterEndpoint.scala:116) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$lzycompute$1(BlockManagerMasterEndpoint.scala:593) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.isExecutorAlive$1(BlockManagerMasterEndpoint.scala:592) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint.org$apache$spark$storage$BlockManagerMasterEndpoint$$register(BlockManagerMasterEndpoint.scala:630) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.storage.BlockManagerMasterEndpoint$$anonfun$receiveAndReply$1.applyOrElse(BlockManagerMasterEndpoint.scala:123) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345] at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345] ... 
3 more Caused by: org.apache.spark.rpc.RpcEndpointNotFoundException: Cannot find endpoint: spark://CoarseGrainedScheduler@uger-c010.broadinstitute.org:33543 at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1(NettyRpcEnv.scala:148) ~[spark-core_2.12-3.3.2.jar:3.3.2] at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$asyncSetupEndpointRefByURI$1$adapted(NettyRpcEnv.scala:144) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.Future.$anonfun$flatMap$1(Future.scala:307) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transformWith$1(Promise.scala:41) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2] at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?] at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33) ~[scala-library-2.12.15.jar:?] at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.processBatch$1(BatchingExecutor.scala:67) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BatchingExecutor$Batch.$anonfun$run$1(BatchingExecutor.scala:82) ~[scala-library-2.12.15.jar:?] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:85) ~[scala-library-2.12.15.jar:?] 
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:59) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:875) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.BatchingExecutor.execute(BatchingExecutor.scala:110) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.BatchingExecutor.execute$(BatchingExecutor.scala:107) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:873) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.trySuccess(Promise.scala:94) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.trySuccess$(Promise.scala:94) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:187) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.rpc.netty.NettyRpcEnv.onSuccess$1(NettyRpcEnv.scala:225) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5(NettyRpcEnv.scala:239) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$askAbortable$5$adapted(NettyRpcEnv.scala:238) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.util.ThreadUtils$$anon$1.execute(ThreadUtils.scala:99) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at scala.concurrent.impl.ExecutionContextImpl$$anon$4.execute(ExecutionContextImpl.scala:138) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:72) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:288) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.complete(Promise.scala:53) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.complete$(Promise.scala:52) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:187) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.success(Promise.scala:86) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.Promise.success$(Promise.scala:86) ~[scala-library-2.12.15.jar:?]
    at scala.concurrent.impl.Promise$DefaultPromise.success(Promise.scala:187) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.rpc.netty.LocalNettyRpcCallContext.send(NettyRpcCallContext.scala:50) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.NettyRpcCallContext.reply(NettyRpcCallContext.scala:32) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.RpcEndpointVerifier$$anonfun$receiveAndReply$1.applyOrElse(RpcEndpointVerifier.scala:31) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.Inbox.$anonfun$process$1(Inbox.scala:103) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.Inbox.safelyCall(Inbox.scala:213) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.Inbox.process(Inbox.scala:100) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.MessageLoop.org$apache$spark$rpc$netty$MessageLoop$$receiveLoop(MessageLoop.scala:75) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.rpc.netty.MessageLoop$$anon$1.run(MessageLoop.scala:41) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    ... 3 more
2023-04-22 21:34:48.979 Executor: ERROR: Exit as unable to send heartbeats to driver more than 60 times
2023-04-22 21:34:50.070 MemoryStore: INFO: Block rdd_207_5 stored as values in memory (estimated size 128.0 MiB, free 24.4 GiB)
2023-04-22 21:34:50.071 BlockManagerInfo: INFO: Added rdd_207_5 in memory on uger-c010.broadinstitute.org:46121 (size: 128.0 MiB, free: 24.5 GiB)
2023-04-22 21:34:50.071 BlockManager: INFO: Found block rdd_207_5 locally
2023-04-22 21:38:08.970 Utils: ERROR: Uncaught exception in thread shutdown-hook-0
java.lang.InterruptedException: null
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014) ~[?:1.8.0_345]
    at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2088) ~[?:1.8.0_345]
    at java.util.concurrent.ThreadPoolExecutor.awaitTermination(ThreadPoolExecutor.java:1475) ~[?:1.8.0_345]
    at org.apache.spark.Heartbeater.stop(Heartbeater.scala:54) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.executor.Executor.stop(Executor.scala:365) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.executor.Executor.$anonfun$stopHookReference$1(Executor.scala:76) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:2066) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) ~[scala-library-2.12.15.jar:?]
    at scala.util.Try$.apply(Try.scala:213) ~[scala-library-2.12.15.jar:?]
    at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178) ~[spark-core_2.12-3.3.2.jar:3.3.2]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) ~[?:1.8.0_345]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) ~[?:1.8.0_345]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_345]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_345]
    at java.lang.Thread.run(Thread.java:750) ~[?:1.8.0_345]
2023-04-22 21:38:08.986 BlockManager: INFO: Found block rdd_188_6 locally
2023-04-22 21:38:08.986 BlockManager: INFO: Found block rdd_188_6 locally