TypeError: 'JavaPackage' object is not callable

1 mt= hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

This shouldn't OOM. The log also shows the IR never came in.

It has 503 GB of memory total. I will try your suggestion.

OK, some of the error output changed. Below is what I get from the notebook side:

2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants…
ERROR:root:Exception while sending command.
Traceback (most recent call last):
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1159, in send_command
    raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 985, in send_command
    response = connection.send_command(command)
  File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1164, in send_command
    "Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

Py4JError                                 Traceback (most recent call last)
<ipython-input-...> in <module>
1 mt= hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count(self)
   2418         Number of rows, number of cols.
   2419         """
--> 2420         return (self.count_rows(), self.count_cols())
   2421
   2422     @typecheck_method(output=str,

</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1127> in count_rows(self, _localize)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py in wrapper(__original_func, *args, **kwargs)
    583     def wrapper(__original_func, *args, **kwargs):
    584         args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 585         return __original_func(*args_, **kwargs_)
    586
    587     return wrapper

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count_rows(self, _localize)
   2373         ir = TableCount(MatrixRowsTable(self._mir))
   2374         if _localize:
--> 2375             return Env.backend().execute(ir)
   2376         else:
   2377             return construct_expr(ir, hl.tint64)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/backend/backend.py in execute(self, ir, timed)
    106
    107     def execute(self, ir, timed=False):
--> 108         result = json.loads(Env.hc()._jhc.backend().executeJSON(self._to_java_ir(ir)))
    109         value = ir.typ._from_json(result['value'])
    110         timings = result['timings']

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1255         answer = self.gateway_client.send_command(command)
   1256         return_value = get_return_value(
--> 1257             answer, self.gateway_client, self.target_id, self.name)
   1258
   1259         for temp_arg in temp_args:

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/utils/java.py in deco(*args, **kwargs)
    207         import pyspark
    208         try:
--> 209             return f(*args, **kwargs)
    210         except py4j.protocol.Py4JJavaError as e:
    211             s = e.java_exception.toString()

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    334             raise Py4JError(
    335                 "An error occurred while calling {0}{1}{2}".
--> 336                 format(target_id, ".", name))
    337         else:
    338             type = answer[1]
Py4JError: An error occurred while calling o51.executeJSON

Below is what I see on the terminal:

[I 11:28:44.633 NotebookApp] Adapting from protocol version 5.1 (kernel 154409c5-822d-4413-aadc-d5ba0c8be324) to 5.3 (client).
2019-08-09 11:28:56 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
ERROR: dlopen("/tmp/libhail6261548975316297961.so"): /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
FATAL: caught exception java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at is.hail.nativecode.NativeCode.<clinit>(NativeCode.java:30)
at is.hail.nativecode.NativeBase.<init>(NativeBase.scala:20)
at is.hail.annotations.Region.<init>(Region.scala:175)
at is.hail.annotations.Region$.apply(Region.scala:16)
at is.hail.annotations.Region$.scoped(Region.scala:18)
at is.hail.expr.ir.ExecuteContext$.scoped(ExecuteContext.scala:7)
at is.hail.backend.Backend.execute(Backend.scala:86)
at is.hail.backend.Backend.executeJSON(Backend.scala:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
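
For context: `CXXABI_1.3.8` is the C++ ABI version that ships with the libstdc++ from GCC 4.9 and later, so the error above means the system libstdc++ in /lib64 is older than the toolchain Hail's native library was built with. One way to check which ABI versions that library actually exports (a diagnostic sketch; the path comes from the error message above):

# List the CXXABI version symbols exported by the system libstdc++;
# CXXABI_1.3.8 must appear for /tmp/libhail*.so to load.
strings /lib64/libstdc++.so.6 | grep CXXABI

If CXXABI_1.3.8 is missing from that list, the usual remedy is putting a newer libstdc++ (e.g. from a newer GCC) on the library path.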

Trying out your setting did cause some changes: it now shows it is using 4.1 GB of memory, but the problem is not solved. Below is the hail.log file output; I am only posting the ending part, where it might be relevant. Any other suggestions? Thank you.

2019-08-09 11:28:59 Hail: INFO: Running Hail version 0.2.19-c6ec8b76eb26
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0 stored as values in memory (estimated size 33.5 KB, free 4.1 GB)
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.9 KB, free 4.1 GB)
2019-08-09 11:28:59 BlockManagerInfo: INFO: Added broadcast_0_piece0 in memory on test:35006 (size: 2.9 KB, free: 4.1 GB)
2019-08-09 11:28:59 SparkContext: INFO: Created broadcast 0 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1 stored as values in memory (estimated size 1643.2 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1_piece0 stored as bytes in memory (estimated size 108.4 KB, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_1_piece0 in memory on test:35006 (size: 108.4 KB, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 1 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2 stored as values in memory (estimated size 8.7 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1000.0 B, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_2_piece0 in memory on test:35006 (size: 1000.0 B, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 2 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 SparkContext: WARN: Using an existing SparkContext; some configuration may not take effect.
2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants…
2019-08-09 11:29:01 SparkContext: INFO: Invoking stop() from shutdown hook
2019-08-09 11:29:01 AbstractConnector: INFO: Stopped Spark@1d783d20{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-08-09 11:29:01 SparkUI: INFO: Stopped Spark web UI at http://test:4040
2019-08-09 11:29:01 MapOutputTrackerMasterEndpoint: INFO: MapOutputTrackerMasterEndpoint stopped!
2019-08-09 11:29:01 MemoryStore: INFO: MemoryStore cleared
2019-08-09 11:29:01 BlockManager: INFO: BlockManager stopped
2019-08-09 11:29:01 BlockManagerMaster: INFO: BlockManagerMaster stopped
2019-08-09 11:29:01 OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: INFO: OutputCommitCoordinator stopped!
2019-08-09 11:29:01 SparkContext: INFO: Successfully stopped SparkContext
2019-08-09 11:29:01 ShutdownHookManager: INFO: Shutdown hook called
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-1bd29d38-6dd0-4562-aae1-2b598144c213
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-332bfc01-8301-4c33-af4c-d1d36e1ac9a0

Hi all! I am having the same issue.

My configuration follows:

conda activate hail
export SPARK_HOME=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7
export PATH=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7/bin:$PATH
export HAIL_HOME=$(pip show hail | grep Location | awk -F' ' '{print $2 "/hail"}')
export PYTHONPATH="$HAIL_HOME:$SPARK_HOME/python:`echo $SPARK_HOME/python/lib/py4j*-src.zip`:$PYTHONPATH"
export SPARK_CLASSPATH=$HAIL_HOME/hail-all-spark.jar

The error is raised on init:

import hail as hl
from pyspark import SparkConf
from pyspark.sql import SparkSession

main_conf = ...  # configuration dict with at least "dfs_block_size"; contents elided
spark_conf = SparkConf().setAppName(APP_NAME).set('spark.executor.cores', cores)
spark = SparkSession.builder.config(conf=spark_conf).getOrCreate()
spark.sparkContext._jsc.hadoopConfiguration().setInt("dfs.block.size", main_conf["dfs_block_size"])
spark.sparkContext._jsc.hadoopConfiguration().setInt("parquet.block.size", main_conf["dfs_block_size"])
hl.init(spark.sparkContext)

The full trace of the error is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1078>", line 2, in init
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 264, in init
    _optimizer_iterations, _backend)
  File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1076>", line 2, in __init__
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 99, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

My main headache is that the "same" set-up works fine on Ubuntu but not on macOS :cry:

For macOS we'd definitely recommend the pip installation route; it's much easier.

I installed Hail using pip as indicated in the documentation (see https://hail.is/docs/0.2/getting_started.html).

Can you try removing this bit and running again to see if you get the same error?

@carleshf Pip-installed Hail does not work with an already-created SparkContext. Can you use PYSPARK_SUBMIT_ARGS to specify your Spark options and start Hail like this?

import hail as hl
hl.init()
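
For instance, exporting the options before launching Python might look like this (a sketch; the 8g driver memory and executor-core count are placeholder values to tune for your machine):

# Spark options for the JVM that hl.init() will launch; pyspark requires
# the trailing "pyspark-shell" token when parsing this variable.
export PYSPARK_SUBMIT_ARGS="--driver-memory 8g --conf spark.executor.cores=4 pyspark-shell"

With that in the environment, hl.init() picks the options up when it creates its own SparkContext.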