Hi,
I am new to both Hail and Spark. What I want is a local test environment where I can load a Hail key table or variant dataset (Hail 0.1) and play with them. I followed the tutorial closely:
https://hail.is/docs/0.1/getting_started.html
I used the following command to launch the interactive environment:
SPARK_HOME=/path_to_spark \
HAIL_HOME=/path_to_hail \
PYTHONPATH="$PYTHONPATH:$HAIL_HOME/build/distributions/hail-python.zip:$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.3-src.zip" \
ipython
I verified that py4j-0.10.3-src.zip, $SPARK_HOME/python, and $HAIL_HOME/build/distributions/hail-python.zip all exist at the paths above, but when I run hc = HailContext() inside ipython I get:
TypeError: 'JavaPackage' object is not callable
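For completeness, everything I type in ipython before hitting the error is just the two lines below (the import is the one shown on the getting started page):

from hail import *   # Hail 0.1 import, as in the getting started docs
hc = HailContext()   # this is the line that raises the TypeError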
The Spark build I am using is spark-2.0.2-bin-hadoop2.7, and Java is openjdk version 1.8.0_212.
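In case it helps to know what I am aiming for: once HailContext starts, all I want to do locally is roughly the following (file names are just placeholders, and the method names may be slightly off since I am still learning the 0.1 API):

hc = HailContext()
vds = hc.read('example.vds')          # load a variant dataset (placeholder path)
kt = hc.import_table('example.tsv')   # load a key table from a TSV (placeholder path)
print(vds.count())                    # just poke at the data to check things work
print(kt.count())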