I am running

mt.write(self.output().path, stage_locally=True, overwrite=True)

(line 63 here: https://github.com/macarthur-lab/hail-elasticsearch-pipelines/blob/master/luigi_pipeline/seqr_loading.py) and I am getting the error:
hail.utils.java.FatalError: HailException: array index out of bounds: 1 / 1. IR: (ArrayRef
(Ref __iruid_5324)
(If
(ApplyComparisonOp LT
(Ref __iruid_5325)
(I32 0 …Java stack trace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 16 in stage 7.0 failed 4 times, most recent failure: Lost task 16.3 in stage 7.0 (TID 3028, 137.187.60.61, executor 2): is.hail.utils.HailException: array index out of bounds: 1 / 1. IR: (ArrayRef
(Ref __iruid_5324)
(If
(ApplyComparisonOp LT
(Ref __iruid_5325)
(I32 0 …
at is.hail.codegen.generated.C51.method_4(Unknown Source)
at is.hail.codegen.generated.C51.method_3(Unknown Source)
at is.hail.codegen.generated.C51.method_2(Unknown Source)
at is.hail.codegen.generated.C51.apply(Unknown Source)
at is.hail.codegen.generated.C51.apply(Unknown Source)
at is.hail.expr.ir.TableMapRows$$anonfun$55$$anonfun$apply$19.apply(TableIR.scala:1123)
at is.hail.expr.ir.TableMapRows$$anonfun$55$$anonfun$apply$19.apply(TableIR.scala:1122)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.Iterator$$anon$12.next(Iterator.scala:445)
at is.hail.io.RichContextRDDRegionValue$$anonfun$boundary$extension$1$$anon$1.next(RichContextRDDRegionValue.scala:193)
It seems similar to a previously reported error. It happens with some of our old VCFs that we ran successfully with Hail 0.1 a year ago, but the pipeline works perfectly with new ones. The old files are VCF 4.1, while the new ones are VCF 4.2.
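For reference, the declared version is recorded in the first header line of each file, so we confirmed which version a given file claims with a small check like the following (a minimal sketch; `vcf_fileformat` is a hypothetical helper, not part of the pipeline):

```python
import gzip

def vcf_fileformat(path):
    """Return the value of the ##fileformat header line of a VCF, or None."""
    # VCFs may be plain text or (b)gzipped; bgzip output is gzip-readable.
    opener = gzip.open if path.endswith((".gz", ".bgz")) else open
    with opener(path, "rt") as f:
        first_line = f.readline().strip()
    # A well-formed VCF starts with e.g. "##fileformat=VCFv4.1".
    if first_line.startswith("##fileformat="):
        return first_line.split("=", 1)[1]
    return None
```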
Do you know what could be done to still be able to run the old files?