TypeError: 'JavaPackage' object is not callable

1 mt = hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

This shouldn't OOM. The log also shows that the IR never came in.

The machine has 503 GB of memory in total. I will try your suggestion.
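For reference, as I understand it the suggestion is to raise the Spark driver memory before init, along these lines (the flag and value are my guess at what was proposed, not a quote):

export PYSPARK_SUBMIT_ARGS="--driver-memory 8g pyspark-shell"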

OK, some of the error output changed. Below is what I get from the notebook side:

2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants...
ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1159, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 985, in send_command
    response = connection.send_command(command)
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1164, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

Py4JError Traceback (most recent call last)
<ipython-input-...> in <module>
1 mt = hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count(self)
2418 Number of rows, number of cols.
2419 """
--> 2420 return (self.count_rows(), self.count_cols())
2421
2422 @typecheck_method(output=str,

</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1127> in count_rows(self, _localize)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py in wrapper(__original_func, *args, **kwargs)
583 def wrapper(__original_func, *args, **kwargs):
584 args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 585 return __original_func(*args_, **kwargs_)
586
587 return wrapper

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count_rows(self, _localize)
2373 ir = TableCount(MatrixRowsTable(self._mir))
2374 if _localize:
--> 2375 return Env.backend().execute(ir)
2376 else:
2377 return construct_expr(ir, hl.tint64)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/backend/backend.py in execute(self, ir, timed)
106
107 def execute(self, ir, timed=False):
--> 108 result = json.loads(Env.hc()._jhc.backend().executeJSON(self._to_java_ir(ir)))
109 value = ir.typ._from_json(result['value'])
110 timings = result['timings']

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
1255 answer = self.gateway_client.send_command(command)
1256 return_value = get_return_value(
--> 1257 answer, self.gateway_client, self.target_id, self.name)
1258
1259 for temp_arg in temp_args:

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/utils/java.py in deco(*args, **kwargs)
207 import pyspark
208 try:
--> 209 return f(*args, **kwargs)
210 except py4j.protocol.Py4JJavaError as e:
211 s = e.java_exception.toString()

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
334 raise Py4JError(
335 "An error occurred while calling {0}{1}{2}".
--> 336 format(target_id, ".", name))
337 else:
338 type = answer[1]

Py4JError: An error occurred while calling o51.executeJSON

Below is what I see on the terminal:

[I 11:28:44.633 NotebookApp] Adapting from protocol version 5.1 (kernel 154409c5-822d-4413-aadc-d5ba0c8be324) to 5.3 (client).
2019-08-09 11:28:56 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
ERROR: dlopen("/tmp/libhail6261548975316297961.so"): /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
FATAL: caught exception java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at is.hail.nativecode.NativeCode.<clinit>(NativeCode.java:30)
at is.hail.nativecode.NativeBase.<init>(NativeBase.scala:20)
at is.hail.annotations.Region.<init>(Region.scala:175)
at is.hail.annotations.Region$.apply(Region.scala:16)
at is.hail.annotations.Region$.scoped(Region.scala:18)
at is.hail.expr.ir.ExecuteContext$.scoped(ExecuteContext.scala:7)
at is.hail.backend.Backend.execute(Backend.scala:86)
at is.hail.backend.Backend.executeJSON(Backend.scala:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
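
The `CXXABI_1.3.8' error above means the system libstdc++ is older than the one Hail's pre-built native library was compiled against (CXXABI_1.3.8 ships with GCC 4.9 and later). A quick way to check what your libstdc++ actually provides, as a sketch:

strings /lib64/libstdc++.so.6 | grep CXXABI

If CXXABI_1.3.8 is missing from the output, you need a newer libstdc++ on the library path, or a Hail build compiled against your own toolchain (e.g. with HAIL_COMPILE_NATIVES=1, as used later in this thread).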

Trying out your setting did cause some changes: it now shows that it's using 4.1 GB of memory, but the problem remains. Below is the hail.log file output; I am only posting the ending part, where it might be relevant. Any other suggestions? Thank you.

2019-08-09 11:28:59 Hail: INFO: Running Hail version 0.2.19-c6ec8b76eb26
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0 stored as values in memory (estimated size 33.5 KB, free 4.1 GB)
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.9 KB, free 4.1 GB)
2019-08-09 11:28:59 BlockManagerInfo: INFO: Added broadcast_0_piece0 in memory on test:35006 (size: 2.9 KB, free: 4.1 GB)
2019-08-09 11:28:59 SparkContext: INFO: Created broadcast 0 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1 stored as values in memory (estimated size 1643.2 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1_piece0 stored as bytes in memory (estimated size 108.4 KB, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_1_piece0 in memory on test:35006 (size: 108.4 KB, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 1 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2 stored as values in memory (estimated size 8.7 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1000.0 B, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_2_piece0 in memory on test:35006 (size: 1000.0 B, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 2 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 SparkContext: WARN: Using an existing SparkContext; some configuration may not take effect.
2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants...
2019-08-09 11:29:01 SparkContext: INFO: Invoking stop() from shutdown hook
2019-08-09 11:29:01 AbstractConnector: INFO: Stopped Spark@1d783d20{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-08-09 11:29:01 SparkUI: INFO: Stopped Spark web UI at http://test:4040
2019-08-09 11:29:01 MapOutputTrackerMasterEndpoint: INFO: MapOutputTrackerMasterEndpoint stopped!
2019-08-09 11:29:01 MemoryStore: INFO: MemoryStore cleared
2019-08-09 11:29:01 BlockManager: INFO: BlockManager stopped
2019-08-09 11:29:01 BlockManagerMaster: INFO: BlockManagerMaster stopped
2019-08-09 11:29:01 OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: INFO: OutputCommitCoordinator stopped!
2019-08-09 11:29:01 SparkContext: INFO: Successfully stopped SparkContext
2019-08-09 11:29:01 ShutdownHookManager: INFO: Shutdown hook called
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-1bd29d38-6dd0-4562-aae1-2b598144c213
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-332bfc01-8301-4c33-af4c-d1d36e1ac9a0

Hi all! I am having the same issue.

My configuration follows:

conda activate hail
export SPARK_HOME=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7
export PATH=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7/bin:$PATH
export HAIL_HOME=$(pip show hail | grep Location | awk -F' ' '{print $2 "/hail"}')
export PYTHONPATH="$HAIL_HOME:$SPARK_HOME/python:`echo $SPARK_HOME/python/lib/py4j*-src.zip`:$PYTHONPATH"
export SPARK_CLASSPATH=$HAIL_HOME/hail-all-spark.jar

The error is raised on init:

import hail as hl
from pyspark import SparkConf
from pyspark.sql import SparkSession
main_conf = # ... #
spark_conf = SparkConf().setAppName(APP_NAME).set('spark.executor.cores', cores)
spark = SparkSession.builder.config(conf=spark_conf).getOrCreate()
spark.sparkContext._jsc.hadoopConfiguration().setInt("dfs.block.size", main_conf["dfs_block_size"])
spark.sparkContext._jsc.hadoopConfiguration().setInt("parquet.block.size", main_conf["dfs_block_size"])
hl.init(spark.sparkContext)

The full trace of the error is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1078>", line 2, in init
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 264, in init
    _optimizer_iterations,_backend)
  File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1076>", line 2, in __init__
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 99, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

My main headache is that the "same" set-up works fine on Ubuntu but not on macOS :cry:

For macOS we'd definitely recommend the pip installation route – much easier.

I installed Hail using pip, as indicated in the documentation (https://hail.is/docs/0.2/getting_started.html).

Can you try removing this bit and running again to see if you get the same error?

@carleshf pip-installed Hail does not work with an already-created SparkContext. (The 'JavaPackage' error means py4j could not find the Hail Java classes on the JVM classpath, so the class reference resolves to an empty package object instead of something callable.) Can you use PYSPARK_SUBMIT_ARGS to specify your Spark options and start Hail as:

import hail as hl
hl.init()
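
For example, the settings from your snippet could be expressed like this before launching Python (a sketch: the executor-core and block-size values are placeholders for yours, and the trailing pyspark-shell token is required):

export PYSPARK_SUBMIT_ARGS="--conf spark.executor.cores=4 --conf spark.hadoop.dfs.block.size=134217728 --conf spark.hadoop.parquet.block.size=134217728 pyspark-shell"

spark.hadoop.* properties get copied into the Hadoop configuration, so they stand in for the hadoopConfiguration().setInt(...) calls above.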

Hi,
I am experiencing the same issue.
I am trying to run Hail in cluster mode. I built it from source against Spark 2.3.2.

/usr/bin/spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.2.3.1.0.0-78
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_112
Branch HEAD
Compiled by user jenkins on 2018-12-06T12:26:34Z
Revision 9b78096afddf26e2d73f0c078a112c9bf979ed53
Url git@github.com:hortonworks/spark2.git
Type --help for more information.

This is how I built hail
sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2

This is how I installed hail in my python environment
sudo /share/ClusterShare/anaconda3/envs/python37/bin/pip install /share/apps/luffy/hail/hail/build/deploy

This is the testing code

import hail as hl
hl.init()

And this is the error I am getting

Fail to execute line 3: hl.init()
Traceback (most recent call last):
  File "/d1/hadoop/yarn/local/usercache/mansop/appcache/application_1572410115474_0103/container_e16_1572410115474_0103_01_000001/tmp/zeppelin_pyspark-2391960415257049015.py", line 380, in <module>
    exec(code, _zcUserQueryNameSpace)
  File "<stdin>", line 3, in <module>
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1108>", line 2, in init
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 280, in init
    _optimizer_iterations,_backend)
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1106>", line 2, in __init__
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 115, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

I am stuck because I don’t understand the meaning of this error or how to keep troubleshooting.

Could someone please share some thoughts?

NOTE: I am using the Zeppelin web notebook to run the test code.

thank you very much

This is how I built hail
sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2

This is how I installed hail in my python environment
sudo /share/ClusterShare/anaconda3/envs/python37/bin/pip install /share/apps/luffy/hail/hail/build/deploy

You shouldn’t need an additional pip install – the install-on-cluster target includes a pip install:

.PHONY: install-on-cluster
install-on-cluster: $(WHEEL)
	sed '/^pyspark/d' python/requirements.txt | xargs $(PIP) install -U
	-$(PIP) uninstall -y hail
	$(PIP) install $(WHEEL) --no-deps

However, if it’s picking up the wrong pip that could be why. Try that make target again including the following env variable:

HAIL_PYTHON3="/share/ClusterShare/anaconda3/envs/python37/bin/python3"

Hi,

as suggested I uninstalled hail through pip, then deleted the local git repo and downloaded it again. Then I installed using the command below:

$ sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2 HAIL_PYTHON3="/share/ClusterShare/anaconda3/envs/python37/bin/python3"
...
removing build/bdist.linux-x86_64/wheel
sed '/^pyspark/d' python/requirements.txt | xargs /share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip install -U
Requirement already up-to-date: aiohttp<3.7,>=3.6 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (3.6.2)
Requirement already up-to-date: aiohttp_session<2.8,>=2.7 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (2.7.0)
Requirement already up-to-date: asyncinit<0.3,>=0.2.4 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.2.4)
Requirement already up-to-date: bokeh<1.3,>1.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.2.0)
Requirement already up-to-date: decorator<5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (4.4.1)
Requirement already up-to-date: gcsfs==0.2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.2.1)
Requirement already up-to-date: hurry.filesize==0.9 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.9)
Requirement already up-to-date: nest_asyncio in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.2.0)
Requirement already up-to-date: numpy<2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.17.4)
Requirement already up-to-date: pandas<0.26,>0.24 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.25.3)
Requirement already up-to-date: parsimonious<0.9 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.8.1)
Requirement already up-to-date: PyJWT in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.7.1)
Requirement already up-to-date: python-json-logger==0.1.11 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.1.11)
Requirement already up-to-date: requests<2.21.1,>=2.21.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (2.21.0)
Requirement already up-to-date: scipy<1.4,>1.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.3.2)
Requirement already up-to-date: tabulate==0.8.3 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.8.3)
Requirement already satisfied, skipping upgrade: async-timeout<4.0,>=3.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (3.0.1)
Requirement already satisfied, skipping upgrade: chardet<4.0,>=2.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (3.0.4)
Requirement already satisfied, skipping upgrade: yarl<2.0,>=1.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (1.3.0)
Requirement already satisfied, skipping upgrade: attrs>=17.3.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (19.3.0)
Requirement already satisfied, skipping upgrade: multidict<5.0,>=4.5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (4.5.2)
Requirement already satisfied, skipping upgrade: Jinja2>=2.7 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (2.10.3)
Requirement already satisfied, skipping upgrade: tornado>=4.3 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (6.0.3)
Requirement already satisfied, skipping upgrade: six>=1.5.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (1.13.0)
Requirement already satisfied, skipping upgrade: pillow>=4.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (6.2.1)
Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (2.8.1)
Requirement already satisfied, skipping upgrade: packaging>=16.8 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (19.2)
Requirement already satisfied, skipping upgrade: PyYAML>=3.10 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (5.1.2)
Requirement already satisfied, skipping upgrade: google-auth>=1.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from gcsfs==0.2.1) (1.7.0)
Requirement already satisfied, skipping upgrade: google-auth-oauthlib in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from gcsfs==0.2.1) (0.4.1)
Requirement already satisfied, skipping upgrade: setuptools in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from hurry.filesize==0.9) (41.6.0.post20191030)
Requirement already satisfied, skipping upgrade: pytz>=2017.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from pandas<0.26,>0.24) (2019.3)
Requirement already satisfied, skipping upgrade: urllib3<1.25,>=1.21.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (1.24.3)
Requirement already satisfied, skipping upgrade: idna<2.9,>=2.5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (2.8)
Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (2019.9.11)
Requirement already satisfied, skipping upgrade: MarkupSafe>=0.23 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from Jinja2>=2.7->bokeh<1.3,>1.1) (1.1.1)
Requirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from packaging>=16.8->bokeh<1.3,>1.1) (2.4.5)
Requirement already satisfied, skipping upgrade: pyasn1-modules>=0.2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (0.2.7)
Requirement already satisfied, skipping upgrade: rsa<4.1,>=3.1.4 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (4.0)
Requirement already satisfied, skipping upgrade: cachetools<3.2,>=2.0.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (3.1.1)
Requirement already satisfied, skipping upgrade: requests-oauthlib>=0.7.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth-oauthlib->gcsfs==0.2.1) (1.3.0)
Requirement already satisfied, skipping upgrade: pyasn1<0.5.0,>=0.4.6 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from pyasn1-modules>=0.2.1->google-auth>=1.2->gcsfs==0.2.1) (0.4.7)
Requirement already satisfied, skipping upgrade: oauthlib>=3.0.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib->gcsfs==0.2.1) (3.1.0)
/share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip uninstall -y hail
WARNING: Skipping hail as it is not installed.
/share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip install build/deploy/dist/hail-0.2.26-py3-none-any.whl --no-deps
Processing ./build/deploy/dist/hail-0.2.26-py3-none-any.whl
Installing collected packages: hail
Successfully installed hail-0.2.26

However, I am still unsuccessful in running the Hail test script:

Fail to execute line 3: hl.init()
Traceback (most recent call last):
  File "/d1/hadoop/yarn/local/usercache/mansop/appcache/application_1572410115474_0110/container_e16_1572410115474_0110_01_000001/tmp/zeppelin_pyspark-4925461021365106531.py", line 380, in <module>
    exec(code, _zcUserQueryNameSpace)
  File "<stdin>", line 3, in <module>
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1108>", line 2, in init
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 280, in init
    _optimizer_iterations,_backend)
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1106>", line 2, in __init__
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 115, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

Any thoughts?

OK, got it working after telling Spark where to find the Hail jar files.
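
For anyone landing here later, "telling Spark where to find the jar" amounts to pointing both the driver and the executors at hail-all-spark.jar, roughly like this (a sketch using the $HAIL_HOME convention from earlier in the thread; Zeppelin users would set the equivalent spark.* properties in the Spark interpreter settings instead):

export PYSPARK_SUBMIT_ARGS="--jars $HAIL_HOME/hail-all-spark.jar --conf spark.driver.extraClassPath=$HAIL_HOME/hail-all-spark.jar --conf spark.executor.extraClassPath=./hail-all-spark.jar pyspark-shell"

The --jars flag ships the jar to each executor's working directory, which is why the executor classpath entry is the relative ./hail-all-spark.jar.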

thank you