Python error when using HailContext

So I’m on CDH 5.10, which has Spark 2.0.0 installed as a side parcel, and I’m simply walking through the “Getting Started” tutorial here:

Everything below works fine, but I hit an error in the python/hail shell when creating a HailContext. Any ideas about what might be going on?

acknowledgements.txt  build      derby.log  gradle   gradlew.bat  python     settings.gradle  www
AUTHORS               build.gradle  code_style.xml  docs       gradlew  LICENSE  src              testng.xml
bin  etc  lib  meta
$ alias hail="PYTHONPATH=$SPARK_HOME/lib/spark2/python:$SPARK_HOME/lib/spark2/python/lib/$HAIL_HOME/python SPARK_CLASSPATH=$HAIL_HOME/build/libs/hail-all-spark.jar python"
$ hail
Python 2.7.5 (default, Sep 15 2016, 22:37:39)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-4)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import hail
>>> hc = hail.HailContext()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/keebler/hail/python/hail/", line 37, in __init__
  File "/opt/cloudera/parcels/SPARK2/lib/spark2/python/pyspark/", line 251, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/opt/cloudera/parcels/SPARK2/lib/spark2/python/pyspark/", line 85, in launch_gateway
    proc = Popen(command, stdin=PIPE, preexec_fn=preexec_func, env=env)
  File "/usr/lib64/python2.7/", line 711, in __init__
    errread, errwrite)
  File "/usr/lib64/python2.7/", line 1327, in _execute_child
    raise child_exception
OSError: [Errno 2] No such file or directory
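For what it’s worth, this traceback pattern usually means pyspark’s launch_gateway could not find the spark-submit script it tries to start: it builds a command around a script under SPARK_HOME and hands it to subprocess.Popen, so if SPARK_HOME doesn’t point at a real Spark install, Popen raises OSError errno 2. A minimal sketch of that failure mode (the path below is made up, not from this setup):

```python
# Sketch of the failure mode: Popen raises OSError errno 2
# ("No such file or directory") when the executable it is asked
# to run does not exist -- the same final error as the traceback
# above, where pyspark could not launch spark-submit.
import errno
from subprocess import PIPE, Popen

try:
    Popen(["/not/a/real/spark/bin/spark-submit"], stdin=PIPE)
except OSError as e:
    print(e.errno == errno.ENOENT)  # True
```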

This is fixed on CDH 5.10 with the Spark2 parcel by setting:

export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2

alias hail="PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/$HAIL_HOME/python SPARK_CLASSPATH=$HAIL_HOME/build/libs/hail-all-spark.jar python"
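As a quick sanity check (a sketch, not part of the original fix; the parcel path is the one from above, so adjust it for your install), you can confirm that the corrected SPARK_HOME actually contains the script pyspark tries to launch:

```python
# Sanity check: pyspark launches $SPARK_HOME/bin/spark-submit, so the
# corrected SPARK_HOME must contain that script. The default path here
# is the CDH Spark2 parcel location used in the fix above.
import os

spark_home = os.environ.get("SPARK_HOME",
                            "/opt/cloudera/parcels/SPARK2/lib/spark2")
spark_submit = os.path.join(spark_home, "bin", "spark-submit")
print(spark_submit, "exists:", os.path.exists(spark_submit))
```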