TypeError: 'JavaPackage' object is not callable


Everything seems to be working except the Python side. Have you ever seen this error? Clearly I have missed something.

$ python
Python 2.7.13 (default, Jul 7 2017, 13:50:00)
[GCC 5.3.1 20160406 (Red Hat 5.3.1-6)] on linux2
Type "help", "copyright", "credits" or "license" for more information.

>>> import hail
>>> hc = hail.HailContext()
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "", line 2, in __init__
  File "hail/typecheck/check.py", line 202, in _typecheck
    return f(*args, **kwargs)
  File "hail/context.py", line 83, in __init__
    parquet_compression, min_block_size, branching_factor, tmp_dir)
TypeError: 'JavaPackage' object is not callable



Hi Kevin,
This usually means that the jar file isn’t set correctly. The way to pass it in is with the SPARK_CLASSPATH environment variable. What’s yours set to?
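
If you want to see what's going wrong, here is a minimal diagnostic sketch (not an official Hail helper). It assumes pyspark is importable and that Hail's entry point is the Scala class is.hail.HailContext: when the jar isn't on the classpath, py4j resolves that name to a JavaPackage placeholder instead of a JavaClass, and calling it raises exactly this TypeError. Run it in a throwaway session, since the JVM it starts won't pick up classpath changes made afterwards.

from pyspark import SparkContext

sc = SparkContext.getOrCreate()
# 'is' is a Python keyword, so walk the is.hail package path via getattr.
cls = getattr(sc._jvm, "is").hail.HailContext
# JavaClass means the jar is visible; JavaPackage reproduces the error above.
print(type(cls))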


Nothing, and that would be the problem.


I set it to SPARK_CLASSPATH=$HAIL_HOME/build/libs/hail-all-spark.jar, and now it works.
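
For reference, the same thing can be done from inside Python before the HailContext is created, since the Spark JVM that pyspark launches inherits this process's environment. This is just a sketch and assumes HAIL_HOME is set to your Hail checkout, as in the command above.

import os

# Must run before hail/pyspark start the JVM.
os.environ["SPARK_CLASSPATH"] = os.path.join(
    os.environ["HAIL_HOME"], "build", "libs", "hail-all-spark.jar")

import hail
hc = hail.HailContext()  # should now resolve the Hail classes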


Fantastic! I’ll change the title of the post to make it more searchable for the “JavaPackage object is not callable” error.


NB: Beryl and I just noticed this also seems to happen if you use Spark 2.2.0 - make sure you’re using 2.0.2 or 2.1.0 (at least until Hail is updated to use 2.2).
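
If you're not sure which Spark version your Python session is actually picking up, a quick check (just a sketch) is to print pyspark's own version string:

import pyspark
print(pyspark.__version__)  # should show 2.0.2 or 2.1.0 until Hail supports 2.2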


Thanks for the observation. First I ran into problems caused by Python 3, and then I was struggling with this :frowning: Thanks for saving me the time (Y)


I encountered this problem with Python 3 / Spark 2.2.0, and the reason is that the SPARK_CLASSPATH environment variable no longer works with Spark 2.2.0.
Thanks to Tim for giving me the solution: setting the following environment variable lets me put the Hail jar on the Spark classpath:
export PYSPARK_SUBMIT_ARGS="--conf spark.driver.extraClassPath=$JAR_PATH --conf spark.executor.extraClassPath=$JAR_PATH pyspark-shell"
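
The same workaround can also be applied from inside Python, because pyspark reads PYSPARK_SUBMIT_ARGS from the environment when it launches the JVM. A sketch, assuming JAR_PATH points at build/libs/hail-all-spark.jar as above, and run before the HailContext (and its SparkContext) is created:

import os

jar_path = os.environ["JAR_PATH"]  # assumed: path to hail-all-spark.jar
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--conf spark.driver.extraClassPath={0} "
    "--conf spark.executor.extraClassPath={0} "
    "pyspark-shell"
).format(jar_path)

import hail
hc = hail.HailContext()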