Can't create HailContext in Hail 0.1 tutorial

Hi,

I am new to both Hail and Spark. What I want is a local test environment where I can load a Hail key table or variant dataset (Hail 0.1) and play with it. I followed the tutorial closely:

https://hail.is/docs/0.1/getting_started.html

and used the command it gives to launch the interactive environment:

SPARK_HOME=/path_to_spark HAIL_HOME=/path_to_hail PYTHONPATH="$PYTHONPATH:$HAIL_HOME/build/distributions/hail-python.zip:$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.3-src.zip" ipython

I verified that py4j-0.10.3-src.zip, $SPARK_HOME/python, and $HAIL_HOME/build/distributions/hail-python.zip all exist, but when I run hc = HailContext() I get the error:

TypeError: 'JavaPackage' object is not callable

The Spark version I am using is spark-2.0.2-bin-hadoop2.7, with openjdk version 1.8.0_212.

Hi, Hail 0.1 is deprecated and unsupported. Can you switch to 0.2? You can pip install hail to get going with 0.2.

I need to use the Elasticsearch pipeline:

and it is written only in 0.1. A 0.2 version has been started, but from what I see it is far from finished. Once that pipeline is ready, I can start moving to 0.2.

My guess is you forgot the SPARK_CLASSPATH= bit, but this is the last help we can give for 0.1 – we don’t have the resources to support a deprecated version of Hail, and really nobody on the development team remembers how to use 0.1 anyhow :slight_smile:
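For anyone hitting the same error later, here is a sketch of the launch command with the missing piece added. The /path_to_spark and /path_to_hail placeholders are from the original post, and the jar location ($HAIL_HOME/build/libs/hail-all-spark.jar) is an assumption based on where a standard Hail 0.1 Gradle build puts it; adjust it to wherever your build actually wrote the jar.

```shell
# Same command as before, plus SPARK_CLASSPATH pointing at the Hail jar.
# Without it, the JVM never loads the Hail classes, and py4j falls back to a
# JavaPackage stub, which is what produces "'JavaPackage' object is not callable".
SPARK_HOME=/path_to_spark \
HAIL_HOME=/path_to_hail \
SPARK_CLASSPATH="$HAIL_HOME/build/libs/hail-all-spark.jar" \
PYTHONPATH="$PYTHONPATH:$HAIL_HOME/build/distributions/hail-python.zip:$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.3-src.zip" \
ipython
```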

It may be worth asking the seqr team when they expect to have 0.2 versions ready.

Ok, got it. Thank you!

I tried it and it worked well, so the unset SPARK_CLASSPATH was indeed the issue.