Spark version problem


When I tried to use Hail 0.2, I got this error:

Py4JJavaError: An error occurred while calling z:is.hail.HailContext.apply.
: is.hail.utils.HailException: This Hail JAR was compiled for Spark 2.2.0, cannot run with Spark 2.4.0.

But the docs say: "For all methods other than using pip, you will additionally need Spark 2.4.x."

When I change the Spark version to 2.2.0, it shows the following message at the end:

Java gateway process exited before sending its port number

What should I do?


Ah, this is sort of a docs versioning problem. Last week we switched from building against Spark 2.2.0 to Spark 2.4.0, but we haven't yet deployed a new version to PyPI that is built for 2.4.0.
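To illustrate what the error above is complaining about, here is a small sketch of a compiled-vs-runtime version check. This is illustrative logic only, not Hail's actual source: the point is that the Spark version the JAR was built against and the Spark version found at runtime must agree on major.minor.

```python
# Illustrative sketch, NOT Hail's actual code: a JAR compiled for one
# Spark version refuses to run against an incompatible one.
def spark_versions_compatible(compiled: str, runtime: str) -> bool:
    # Only the major.minor components need to agree, e.g. "2.2" vs "2.4".
    return compiled.split(".")[:2] == runtime.split(".")[:2]

print(spark_versions_compatible("2.2.0", "2.4.0"))  # the reported mismatch: False
print(spark_versions_compatible("2.4.0", "2.4.1"))  # patch versions differ: True
```

So the fix is to make the two versions agree, either by using a Hail build for your Spark, or a Spark matching your Hail build.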

How did you install Hail?

Hi, Tpoterba.

After days of trying, I finally successfully installed and initialized Hail. But one more error occurred…

ERROR:py4j.java_gateway:An error occurred while trying to connect to the Java server (


Did you install hail using pip or did you compile a JAR yourself?

Can you run the following and copy paste the entire output to here?

python -m pip show hail
python -m pip show pyspark
which pyspark
which python
which pip
python -c 'import hail as hl; hl.balding_nichols_model(3, 100, 100)._force_count_rows()'
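If it helps, the first two `pip show` checks can be approximated from inside Python as well. This is a generic sketch, not a Hail-specific tool, for spotting a missing or mixed hail/pyspark install:

```python
# Sketch mirroring `python -m pip show <pkg>`: report whether a package
# is importable from the current Python environment.
import importlib.util

def is_installed(pkg: str) -> bool:
    # find_spec returns None when the top-level package is not importable.
    return importlib.util.find_spec(pkg) is not None

for pkg in ("hail", "pyspark"):
    print(pkg, "found" if is_installed(pkg) else "NOT found")
```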

Then run this:

ls -rt hail*.log | tail -n 1

That should print a filename. Can you please attach that file here?
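For reference, the same "newest matching file" lookup can be done from Python if `ls` is awkward on your setup; the glob pattern is the same one used in the command above:

```python
# Sketch equivalent to `ls -rt hail*.log | tail -n 1`: pick the most
# recently modified file matching a glob pattern.
import glob
import os

def newest_match(pattern: str):
    matches = glob.glob(pattern)
    # Sort by modification time and return the newest, or None if no match.
    return max(matches, key=os.path.getmtime) if matches else None

print(newest_match("hail*.log"))  # None if no Hail logs are in this directory
```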

Finally run this:

ls -lrt *hs_err_pid* | tail -n 1

If that outputs a filename, can you please attach that file here?

With that information, we will be able to diagnose your issue.