Initialise Hail with existing Spark

Hi @ireneisdoomed!

Are the Hail JARs in the class path of your Spark cluster? There are some details on using arbitrary Spark clusters in the docs.

For example, if you’re using spark-submit, you need to ship the Hail JAR and put it on both the driver and executor class paths:

HAIL_HOME=$(pip3 show hail | grep Location | awk -F' ' '{print $2 "/hail"}')
spark-submit \
  --jars $HAIL_HOME/hail-all-spark.jar \
  --conf spark.driver.extraClassPath=$HAIL_HOME/hail-all-spark.jar \
  --conf spark.executor.extraClassPath=./hail-all-spark.jar \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator \
  hail-script.py
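Once the JAR is on the class path, your script can attach Hail to the existing Spark context rather than letting Hail spin up its own. A minimal sketch of what `hail-script.py` might contain (the `SparkSession` setup is an assumption; the key call is passing your `SparkContext` to `hl.init`):

```python
import hail as hl
from pyspark.sql import SparkSession

# Reuse the session created by spark-submit; this assumes the
# hail-all-spark.jar is already on the driver/executor class paths
# via the flags shown above.
spark = SparkSession.builder.getOrCreate()

# Hand the existing SparkContext to Hail instead of letting hl.init
# construct a new local one.
hl.init(sc=spark.sparkContext)
```

If `hl.init` raises a `ClassNotFoundException` for Hail classes at this point, that usually means the JAR flags above didn’t take effect, so double-checking the class-path configuration is the first thing to try.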