TypeError: 'JavaPackage' object is not callable

While waiting for a response, I reinstalled Hail with pip, installed java-jdk 8.0.92 with conda, and ran the following:

import hail as hl
hl.init()

2019-08-06 15:52:54 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
2019-08-06 15:52:54 WARN DependencyUtils:66 - Local jar /build/libs/hail-all-spark.jar does not exist, skipping.
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1064>", line 2, in init
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
return __original_func(*args_, **kwargs_)
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 264, in init
_optimizer_iterations, _backend)
File "</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1062>", line 2, in init
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
return __original_func(*args_, **kwargs_)
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 99, in init
min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

Before reinstalling Hail, I deleted a folder called hail (I don’t remember where it came from), and it looks like my system no longer knows where Hail is… :frowning:

Any suggestions on what I can try? Thank you.

Hmm. Something to try: delete all the environment variables you set and run the following:

HAIL_COMPILE_NATIVES=1 make -C hail/ install

This should create and install a local wheel (a pip distribution).

@him26, to be clear, Tim and I suspect your machine’s CPU is missing features on which the pip-installed-version of Hail depends. Can you share the output of cat /proc/cpuinfo?

Tim’s suggestion to download the hail repository and compile from source is your best bet.

@danking Here is the output you asked for. It has ~1400 lines; below are the first 25, after which the same block repeats for each processor.

processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 79
model name : Intel(R) Xeon(R) CPU E5-2660 v4 @ 2.00GHz
stepping : 1
microcode : 0xb00002a
cpu MHz : 1200.195
cache size : 35840 KB
physical id : 0
siblings : 28
core id : 0
cpu cores : 14
apicid : 0
initial apicid : 0
fpu : yes
fpu_exception : yes
cpuid level : 20
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc aperfmperf eagerfpu pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch epb cat_l3 cdp_l3 intel_ppin intel_pt ibrs ibpb stibp tpr_shadow vnmi flexpriority ept vpid fsgsbase tsc_adjust bmi1 hle avx2 smep bmi2 erms invpcid rtm cqm rdt_a rdseed adx smap xsaveopt cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local dtherm ida arat pln pts spec_ctrl intel_stibp
bogomips : 3991.39
clflush size : 64
cache_alignment : 64
address sizes : 46 bits physical, 48 bits virtual
power management:

I will try Tim’s suggestion and report back. Thank you.


I downloaded the Hail 0.2.19 release version, erased all environment variables, and ran Tim’s suggested command. The resulting output is below.

fatal: Not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: Not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
fatal: Not a git repository (or any parent up to mount point /home)
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
make: Entering directory `/home/him/hail/hail'
echo '[Build Metadata]' > src/main/resources/build-info.properties
echo 'user=him' >> src/main/resources/build-info.properties
echo 'revision=' >> src/main/resources/build-info.properties
echo 'branch=' >> src/main/resources/build-info.properties
echo 'date=2019-08-07T00:59:52Z' >> src/main/resources/build-info.properties
echo 'url=' >> src/main/resources/build-info.properties
echo 'sparkVersion=2.4.0' >> src/main/resources/build-info.properties
echo 'hailPipVersion=0.2.19' >> src/main/resources/build-info.properties
echo 0.2.19- > python/hail/hail_version
echo 0.2.19 > python/hail/hail_pip_version
cp -f python/hail/hail_version python/hailtop/hail_version
cp -f python/hail/hail_pip_version python/hailtop/hail_pip_version
./gradlew releaseJar
Downloading (edited out due to user limit) gradle-5.4.1-bin.zip

Unzipping /home/him/.gradle/wrapper/dists/gradle-5.4.1-bin/e75iq110yv9r9wt1a6619x2xm/gradle-5.4.1-bin.zip to /home/him/.gradle/wrapper/dists/gradle-5.4.1-bin/e75iq110yv9r9wt1a6619x2xm
Set executable permissions for: /home/him/.gradle/wrapper/dists/gradle-5.4.1-bin/e75iq110yv9r9wt1a6619x2xm/gradle-5.4.1/bin/gradle

Welcome to Gradle 5.4.1!

Here are the highlights of this release:

  • Run builds with JDK12
  • New API for Incremental Tasks
  • Updates to native projects, including Swift 5 support

For more details see (edited out due to user limit)

Starting a Gradle Daemon (subsequent builds will be faster)

Task :nativeLibPrebuilt FAILED
g++: error: unrecognized command line option '-std=c++14'
make[1]: Entering directory `/home/him/hail/hail/src/main/c'
g++ -march=corei7-avx -O3 -std=c++14 -Ilibsimdpp-2.1 -Wall -Wextra -fPIC -ggdb -fno-strict-aliasing -I../resources/include -I/home/him/miniconda3/envs/hail/include -I/home/him/miniconda3/envs/hail/include/linux Upcalls.cpp -MG -M -MF build/Upcalls.d -MT build/Upcalls.o
g++: error: unrecognized command line option '-std=c++14'
[the same g++ invocation and the same "unrecognized command line option '-std=c++14'" error repeat for every source file: testutils/unit-tests.cpp, test.cpp, Region_test.cpp, Region.cpp, PartitionIterators.cpp, ObjectArray.cpp, NativeStatus.cpp, NativePtr.cpp, NativeModule.cpp, NativeLongFunc.cpp, NativeCodeSuite.cpp, NativeBoot.cpp, Logging.cpp, ibs.cpp, FS.cpp, Encoder.cpp, Decoder.cpp, davies.cpp, cache-tests.cpp, ApproximateQuantiles_test.cpp]
g++ -o build/NativeBoot.o -march=corei7-avx -O3 -std=c++14 -Ilibsimdpp-2.1 -Wall -Wextra -fPIC -ggdb -fno-strict-aliasing -I../resources/include -I/home/him/miniconda3/envs/hail/include -I/home/him/miniconda3/envs/hail/include/linux -MD -MF build/NativeBoot.d -MT build/NativeBoot.o -c NativeBoot.cpp
g++: error: unrecognized command line option '-std=c++14'
make[1]: *** [build/NativeBoot.o] Error 1
make[1]: Leaving directory `/home/him/hail/hail/src/main/c'

FAILURE: Build failed with an exception.

  • What went wrong:
    Execution failed for task ':nativeLibPrebuilt'.

Process 'command 'make'' finished with non-zero exit value 2

  • Try:
    Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

  • Get more help at (edited out due to user limit)

BUILD FAILED in 1m 3s
1 actionable task: 1 executed
make: *** [shadowJar] Error 1
make: Leaving directory `/home/him/hail/hail'

Ok, I think I understand now. When you use pip to install hail, you should not set any of those environment variables. In fact, if any of them are set, Hail might not start correctly.

Your CPU has the flags I expect. Compiling from source might fix the problem but only because the environment variables are for the non-pip-installed version of Hail.
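A minimal sketch of that clean start (the variable names are the ones that appear later in this thread for the source-based setup; the pip wheel needs none of them set):

```shell
# Clear the source-install variables so the pip wheel's bundled jar is used.
unset HAIL_HOME SPARK_HOME SPARK_CLASSPATH PYTHONPATH
# Confirm nothing Hail- or Spark-related lingers in the environment:
env | grep -iE 'hail|spark' || echo 'environment clean'
```

After that, `pip uninstall -y hail && pip install hail` gives a clean wheel install.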

I erased all environment variables, then uninstalled and reinstalled Hail.
Retrying the first tutorial command, I now get a different error… :frowning:

Initializing Spark and Hail with default parameters…
Running on Apache Spark version 2.4.0
SparkUI available at (removed)
Welcome to
     __  __     <>__
    / /_/ /__  __/ /
   / __  / _ `/ / /
  /_/ /_/\_,_/_/_/   version 0.2.19-c6ec8b76eb26
LOGGING: writing to /home/him/DATA/hail-20190807-1044-0.2.19-c6ec8b76eb26.log
2019-08-07 10:44:32 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants…
ERROR:root:Exception while sending command.

Traceback (most recent call last):
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1159, in send_command
raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 985, in send_command
response = connection.send_command(command)
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1164, in send_command
"Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving


Py4JError Traceback (most recent call last)
in
1 mt= hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count(self)
2418 Number of rows, number of cols.
2419 """
-> 2420 return (self.count_rows(), self.count_cols())
2421
2422 @typecheck_method(output=str,

</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1127> in count_rows(self, _localize)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py in wrapper(__original_func, *args, **kwargs)
583 def wrapper(__original_func, *args, **kwargs):
584 args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 585 return __original_func(*args_, **kwargs_)
586
587 return wrapper

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count_rows(self, _localize)
2373 ir = TableCount(MatrixRowsTable(self._mir))
2374 if _localize:
-> 2375 return Env.backend().execute(ir)
2376 else:
2377 return construct_expr(ir, hl.tint64)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/backend/backend.py in execute(self, ir, timed)
106
107 def execute(self, ir, timed=False):
--> 108 result = json.loads(Env.hc()._jhc.backend().executeJSON(self._to_java_ir(ir)))
109 value = ir.typ._from_json(result['value'])
110 timings = result['timings']

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
1255 answer = self.gateway_client.send_command(command)
1256 return_value = get_return_value(
-> 1257 answer, self.gateway_client, self.target_id, self.name)
1258
1259 for temp_arg in temp_args:

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/utils/java.py in deco(*args, **kwargs)
207 import pyspark
208 try:
--> 209 return f(*args, **kwargs)
210 except py4j.protocol.Py4JJavaError as e:
211 s = e.java_exception.toString()

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
334 raise Py4JError(
335 "An error occurred while calling {0}{1}{2}".
--> 336 format(target_id, ".", name))
337 else:
338 type = answer[1]

Py4JError: An error occurred while calling o50.executeJSON

Can you share this log file?

The true source of the error should be present there.

Here is the link to the log file:
hail.log

The log indicates that Spark was told to shut down. Sometimes this is caused by an out-of-memory error. How much RAM does this machine have? Can you try increasing the memory available to Spark with export PYSPARK_SUBMIT_ARGS="--driver-memory 8g pyspark-shell"?

1 mt= hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

This shouldn’t OOM. The log also shows that the IR never came in.

The machine has 503 GB of RAM total. I will try your suggestion.

Ok, some of the error output changed. Below is what I get on the notebook side:

2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants…
ERROR:root:Exception while sending command.
Traceback (most recent call last):
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1159, in send_command
raise Py4JNetworkError("Answer from Java side is empty")
py4j.protocol.Py4JNetworkError: Answer from Java side is empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 985, in send_command
response = connection.send_command(command)
File "/home/him/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py", line 1164, in send_command
"Error while receiving", e, proto.ERROR_ON_RECEIVE)
py4j.protocol.Py4JNetworkError: Error while receiving

Py4JError Traceback (most recent call last)
in
1 mt= hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
----> 2 mt.count()

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count(self)
2418 Number of rows, number of cols.
2419 """
-> 2420 return (self.count_rows(), self.count_cols())
2421
2422 @typecheck_method(output=str,

</home/him/miniconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1127> in count_rows(self, _localize)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py in wrapper(__original_func, *args, **kwargs)
583 def wrapper(__original_func, *args, **kwargs):
584 args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 585 return __original_func(*args_, **kwargs_)
586
587 return wrapper

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/matrixtable.py in count_rows(self, _localize)
2373 ir = TableCount(MatrixRowsTable(self._mir))
2374 if _localize:
-> 2375 return Env.backend().execute(ir)
2376 else:
2377 return construct_expr(ir, hl.tint64)

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/backend/backend.py in execute(self, ir, timed)
106
107 def execute(self, ir, timed=False):
--> 108 result = json.loads(Env.hc()._jhc.backend().executeJSON(self._to_java_ir(ir)))
109 value = ir.typ._from_json(result['value'])
110 timings = result['timings']

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/java_gateway.py in __call__(self, *args)
1255 answer = self.gateway_client.send_command(command)
1256 return_value = get_return_value(
-> 1257 answer, self.gateway_client, self.target_id, self.name)
1258
1259 for temp_arg in temp_args:

~/miniconda3/envs/hail/lib/python3.6/site-packages/hail/utils/java.py in deco(*args, **kwargs)
207 import pyspark
208 try:
--> 209 return f(*args, **kwargs)
210 except py4j.protocol.Py4JJavaError as e:
211 s = e.java_exception.toString()

~/miniconda3/envs/hail/lib/python3.6/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
334 raise Py4JError(
335 "An error occurred while calling {0}{1}{2}".
--> 336 format(target_id, ".", name))
337 else:
338 type = answer[1]

Py4JError: An error occurred while calling o51.executeJSON

Below is what I see on the terminal:

[I 11:28:44.633 NotebookApp] Adapting from protocol version 5.1 (kernel 154409c5-822d-4413-aadc-d5ba0c8be324) to 5.3 (client).
2019-08-09 11:28:56 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
Setting default log level to “WARN”.
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
ERROR: dlopen("/tmp/libhail6261548975316297961.so"): /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
FATAL: caught exception java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
java.lang.UnsatisfiedLinkError: /tmp/libhail6261548975316297961.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /tmp/libhail6261548975316297961.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at is.hail.nativecode.NativeCode.<clinit>(NativeCode.java:30)
at is.hail.nativecode.NativeBase.<init>(NativeBase.scala:20)
at is.hail.annotations.Region.<init>(Region.scala:175)
at is.hail.annotations.Region$.apply(Region.scala:16)
at is.hail.annotations.Region$.scoped(Region.scala:18)
at is.hail.expr.ir.ExecuteContext$.scoped(ExecuteContext.scala:7)
at is.hail.backend.Backend.execute(Backend.scala:86)
at is.hail.backend.Backend.executeJSON(Backend.scala:92)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:745)
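The `CXXABI_1.3.8' failure means the runtime /lib64/libstdc++.so.6 is older than the g++ that compiled the native library; CXXABI_1.3.8 first shipped with the libstdc++ of GCC 4.9. A sketch of how to check what the system runtime actually provides (the search paths are assumptions covering common distros):

```shell
# Locate the runtime libstdc++ and list the CXXABI versions it exports.
# If CXXABI_1.3.8 is absent, this runtime cannot load a shared library
# built by a newer g++ (such as one from a conda toolchain).
libstdcpp=$(find /lib64 /usr/lib64 /usr/lib/x86_64-linux-gnu /usr/lib \
    -maxdepth 1 -name 'libstdc++.so.6' 2>/dev/null | head -n 1)
echo "runtime libstdc++: $libstdcpp"
grep -ao 'CXXABI_1\.[0-9.]*' "$libstdcpp" | sort -uV
```

If the newest version printed is older than 1.3.8, either upgrade the system libstdc++ or compile Hail's native code with the same toolchain whose runtime will load it.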

Trying out your setting did cause some changes: it now shows 4.1 GB of memory in use, but it did not solve the problem. Below is the tail of the hail.log output, where the relevant part should be. Any other suggestions? Thank you.

2019-08-09 11:28:59 Hail: INFO: Running Hail version 0.2.19-c6ec8b76eb26
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0 stored as values in memory (estimated size 33.5 KB, free 4.1 GB)
2019-08-09 11:28:59 MemoryStore: INFO: Block broadcast_0_piece0 stored as bytes in memory (estimated size 2.9 KB, free 4.1 GB)
2019-08-09 11:28:59 BlockManagerInfo: INFO: Added broadcast_0_piece0 in memory on test:35006 (size: 2.9 KB, free: 4.1 GB)
2019-08-09 11:28:59 SparkContext: INFO: Created broadcast 0 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1 stored as values in memory (estimated size 1643.2 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_1_piece0 stored as bytes in memory (estimated size 108.4 KB, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_1_piece0 in memory on test:35006 (size: 108.4 KB, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 1 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2 stored as values in memory (estimated size 8.7 KB, free 4.1 GB)
2019-08-09 11:29:00 MemoryStore: INFO: Block broadcast_2_piece0 stored as bytes in memory (estimated size 1000.0 B, free 4.1 GB)
2019-08-09 11:29:00 BlockManagerInfo: INFO: Added broadcast_2_piece0 in memory on test:35006 (size: 1000.0 B, free: 4.1 GB)
2019-08-09 11:29:00 SparkContext: INFO: Created broadcast 2 from broadcast at SparkBackend.scala:26
2019-08-09 11:29:00 SparkContext: WARN: Using an existing SparkContext; some configuration may not take effect.
2019-08-09 11:29:00 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 50 samples, and 100 variants…
2019-08-09 11:29:01 SparkContext: INFO: Invoking stop() from shutdown hook
2019-08-09 11:29:01 AbstractConnector: INFO: Stopped Spark@1d783d20{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2019-08-09 11:29:01 SparkUI: INFO: Stopped Spark web UI at http://test:4040
2019-08-09 11:29:01 MapOutputTrackerMasterEndpoint: INFO: MapOutputTrackerMasterEndpoint stopped!
2019-08-09 11:29:01 MemoryStore: INFO: MemoryStore cleared
2019-08-09 11:29:01 BlockManager: INFO: BlockManager stopped
2019-08-09 11:29:01 BlockManagerMaster: INFO: BlockManagerMaster stopped
2019-08-09 11:29:01 OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: INFO: OutputCommitCoordinator stopped!
2019-08-09 11:29:01 SparkContext: INFO: Successfully stopped SparkContext
2019-08-09 11:29:01 ShutdownHookManager: INFO: Shutdown hook called
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-1bd29d38-6dd0-4562-aae1-2b598144c213
2019-08-09 11:29:01 ShutdownHookManager: INFO: Deleting directory /tmp/spark-332bfc01-8301-4c33-af4c-d1d36e1ac9a0

Hi all! I am having the same issue.

My configuration follows:

conda activate hail
export SPARK_HOME=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7
export PATH=/Users/chernandez/Software/spark-2.4.4-bin-hadoop2.7/bin:$PATH
export HAIL_HOME=$(pip show hail | grep Location | awk -F' ' '{print $2 "/hail"}')
export PYTHONPATH="$HAIL_HOME:$SPARK_HOME/python:`echo $SPARK_HOME/python/lib/py4j*-src.zip`:$PYTHONPATH"
export SPARK_CLASSPATH=$HAIL_HOME/hail-all-spark.jar

The error is raised on init:

from pyspark import SparkConf
from pyspark.sql import SparkSession
import hail as hl

main_conf = # ... #
spark_conf = SparkConf().setAppName(APP_NAME).set('spark.executor.cores', cores)
spark = SparkSession.builder.config(conf=spark_conf).getOrCreate()
spark.sparkContext._jsc.hadoopConfiguration().setInt("dfs.block.size", main_conf["dfs_block_size"])
spark.sparkContext._jsc.hadoopConfiguration().setInt("parquet.block.size", main_conf["dfs_block_size"])
hl.init(spark.sparkContext)

The full trace of the error is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1078>", line 2, in init
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
   File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 264, in init
_optimizer_iterations,_backend)
   File "</Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/decorator.py:decorator-gen-1076>", line 2, in __init__
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/typecheck/check.py", line 585, in wrapper
return __original_func(*args_, **kwargs_)
  File "/Users/.../.anaconda3/envs/hail/lib/python3.6/site-packages/hail/context.py", line 99, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

My main headache is that the “same” set-up works fine on Ubuntu but not on macOS :cry:

For MacOS we’d definitely recommend the pip installation route – much easier.

I installed Hail using pip as indicated in the documentation (i.e. https://hail.is/docs/0.2/getting_started.html).

Can you try removing this bit and running again, to see if you get the same error?

@carleshf The pip-installed Hail does not work with an already-created SparkContext. Can you use PYSPARK_SUBMIT_ARGS to specify your Spark options and start Hail as:

import hail as hl
hl.init()
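For example (the memory and core values below are placeholders, not taken from this thread’s configuration; the trailing `pyspark-shell` token is required by pyspark):

```shell
# Pass Spark options through the environment instead of building a
# SparkContext by hand; hl.init() then creates the context itself.
export PYSPARK_SUBMIT_ARGS="--driver-memory 8g --conf spark.executor.cores=4 pyspark-shell"
```

With this set, starting Python and calling `hl.init()` with no arguments picks up the options.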

Hi,
I am experiencing the same issue.
I am trying to run Hail in cluster mode. I built it from source against Spark 2.3.2.

/usr/bin/spark-submit --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.2.3.1.0.0-78
      /_/

Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_112
Branch HEAD
Compiled by user jenkins on 2018-12-06T12:26:34Z
Revision 9b78096afddf26e2d73f0c078a112c9bf979ed53
Url git@github.com:hortonworks/spark2.git
Type --help for more information.

This is how I built hail
sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2

This is how I installed hail in my python environment
sudo /share/ClusterShare/anaconda3/envs/python37/bin/pip install /share/apps/luffy/hail/hail/build/deploy

This is the testing code

import hail as hl
hl.init()

And this is the error I am getting

Fail to execute line 3: hl.init()
Traceback (most recent call last):
  File "/d1/hadoop/yarn/local/usercache/mansop/appcache/application_1572410115474_0103/container_e16_1572410115474_0103_01_000001/tmp/zeppelin_pyspark-2391960415257049015.py", line 380, in <module>
    exec(code, _zcUserQueryNameSpace)
  File "<stdin>", line 3, in <module>
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1108>", line 2, in init
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 280, in init
    _optimizer_iterations,_backend)
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1106>", line 2, in __init__
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 115, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

I am stuck because I don’t understand the meaning of this error or how to keep troubleshooting.

Could someone please share some thoughts?

NOTE: I am using Zeppelin web notebook to run the test code

thank you very much

This is how I built hail
sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2

This is how I installed hail in my python environment
sudo /share/ClusterShare/anaconda3/envs/python37/bin/pip install /share/apps/luffy/hail/hail/build/deploy

You shouldn’t need an additional pip install – the install-on-cluster target includes a pip install:

.PHONY: install-on-cluster
install-on-cluster: $(WHEEL)
	sed '/^pyspark/d' python/requirements.txt | xargs $(PIP) install -U
	-$(PIP) uninstall -y hail
	$(PIP) install $(WHEEL) --no-deps

However, if it’s picking up the wrong pip, that could be why. Try the make target again with the following environment variable set:

HAIL_PYTHON3="/share/ClusterShare/anaconda3/envs/python37/bin/python3"
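That is, the full invocation would look something like:

```shell
sudo make install-on-cluster \
    HAIL_COMPILE_NATIVES=1 \
    SPARK_VERSION=2.3.2 \
    HAIL_PYTHON3="/share/ClusterShare/anaconda3/envs/python37/bin/python3"
```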

Hi,

As suggested, I uninstalled Hail through pip, then deleted the local git repo and downloaded it again. Then I reinstalled using the command below:

$ sudo make install-on-cluster HAIL_COMPILE_NATIVES=1 SPARK_VERSION=2.3.2 HAIL_PYTHON3="/share/ClusterShare/anaconda3/envs/python37/bin/python3"
...
removing build/bdist.linux-x86_64/wheel
sed '/^pyspark/d' python/requirements.txt | xargs /share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip install -U
Requirement already up-to-date: aiohttp<3.7,>=3.6 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (3.6.2)
Requirement already up-to-date: aiohttp_session<2.8,>=2.7 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (2.7.0)
Requirement already up-to-date: asyncinit<0.3,>=0.2.4 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.2.4)
Requirement already up-to-date: bokeh<1.3,>1.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.2.0)
Requirement already up-to-date: decorator<5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (4.4.1)
Requirement already up-to-date: gcsfs==0.2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.2.1)
Requirement already up-to-date: hurry.filesize==0.9 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.9)
Requirement already up-to-date: nest_asyncio in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.2.0)
Requirement already up-to-date: numpy<2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.17.4)
Requirement already up-to-date: pandas<0.26,>0.24 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.25.3)
Requirement already up-to-date: parsimonious<0.9 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.8.1)
Requirement already up-to-date: PyJWT in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.7.1)
Requirement already up-to-date: python-json-logger==0.1.11 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.1.11)
Requirement already up-to-date: requests<2.21.1,>=2.21.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (2.21.0)
Requirement already up-to-date: scipy<1.4,>1.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (1.3.2)
Requirement already up-to-date: tabulate==0.8.3 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (0.8.3)
Requirement already satisfied, skipping upgrade: async-timeout<4.0,>=3.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (3.0.1)
Requirement already satisfied, skipping upgrade: chardet<4.0,>=2.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (3.0.4)
Requirement already satisfied, skipping upgrade: yarl<2.0,>=1.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (1.3.0)
Requirement already satisfied, skipping upgrade: attrs>=17.3.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (19.3.0)
Requirement already satisfied, skipping upgrade: multidict<5.0,>=4.5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from aiohttp<3.7,>=3.6) (4.5.2)
Requirement already satisfied, skipping upgrade: Jinja2>=2.7 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (2.10.3)
Requirement already satisfied, skipping upgrade: tornado>=4.3 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (6.0.3)
Requirement already satisfied, skipping upgrade: six>=1.5.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (1.13.0)
Requirement already satisfied, skipping upgrade: pillow>=4.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (6.2.1)
Requirement already satisfied, skipping upgrade: python-dateutil>=2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (2.8.1)
Requirement already satisfied, skipping upgrade: packaging>=16.8 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (19.2)
Requirement already satisfied, skipping upgrade: PyYAML>=3.10 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from bokeh<1.3,>1.1) (5.1.2)
Requirement already satisfied, skipping upgrade: google-auth>=1.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from gcsfs==0.2.1) (1.7.0)
Requirement already satisfied, skipping upgrade: google-auth-oauthlib in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from gcsfs==0.2.1) (0.4.1)
Requirement already satisfied, skipping upgrade: setuptools in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from hurry.filesize==0.9) (41.6.0.post20191030)
Requirement already satisfied, skipping upgrade: pytz>=2017.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from pandas<0.26,>0.24) (2019.3)
Requirement already satisfied, skipping upgrade: urllib3<1.25,>=1.21.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (1.24.3)
Requirement already satisfied, skipping upgrade: idna<2.9,>=2.5 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (2.8)
Requirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests<2.21.1,>=2.21.0) (2019.9.11)
Requirement already satisfied, skipping upgrade: MarkupSafe>=0.23 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from Jinja2>=2.7->bokeh<1.3,>1.1) (1.1.1)
Requirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from packaging>=16.8->bokeh<1.3,>1.1) (2.4.5)
Requirement already satisfied, skipping upgrade: pyasn1-modules>=0.2.1 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (0.2.7)
Requirement already satisfied, skipping upgrade: rsa<4.1,>=3.1.4 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (4.0)
Requirement already satisfied, skipping upgrade: cachetools<3.2,>=2.0.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth>=1.2->gcsfs==0.2.1) (3.1.1)
Requirement already satisfied, skipping upgrade: requests-oauthlib>=0.7.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from google-auth-oauthlib->gcsfs==0.2.1) (1.3.0)
Requirement already satisfied, skipping upgrade: pyasn1<0.5.0,>=0.4.6 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from pyasn1-modules>=0.2.1->google-auth>=1.2->gcsfs==0.2.1) (0.4.7)
Requirement already satisfied, skipping upgrade: oauthlib>=3.0.0 in /share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib->gcsfs==0.2.1) (3.1.0)
/share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip uninstall -y hail
WARNING: Skipping hail as it is not installed.
/share/ClusterShare/anaconda3/envs/python37/bin/python3 -m pip install build/deploy/dist/hail-0.2.26-py3-none-any.whl --no-deps
Processing ./build/deploy/dist/hail-0.2.26-py3-none-any.whl
Installing collected packages: hail
Successfully installed hail-0.2.26

However, I am still unable to run the Hail test script:

Fail to execute line 3: hl.init()
Traceback (most recent call last):
  File "/d1/hadoop/yarn/local/usercache/mansop/appcache/application_1572410115474_0110/container_e16_1572410115474_0110_01_000001/tmp/zeppelin_pyspark-4925461021365106531.py", line 380, in <module>
    exec(code, _zcUserQueryNameSpace)
  File "<stdin>", line 3, in <module>
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1108>", line 2, in init
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 280, in init
    _optimizer_iterations,_backend)
  File "</share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/decorator.py:decorator-gen-1106>", line 2, in __init__
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/typecheck/check.py", line 585, in wrapper
    return __original_func(*args_, **kwargs_)
  File "/share/ClusterShare/anaconda3/envs/python37/lib/python3.7/site-packages/hail/context.py", line 115, in __init__
    min_block_size, branching_factor, tmp_dir, optimizer_iterations)
TypeError: 'JavaPackage' object is not callable

Any thoughts?

OK, got it working after specifying where Spark should find the Hail jar files.
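In case it helps anyone else: the pip-installed package alone isn’t enough on a Spark cluster, because the compiled backend lives in hail-all-spark.jar and Spark needs that jar on the driver and executor classpaths. The properties are roughly like this (set in spark-defaults.conf or in Zeppelin’s Spark interpreter settings; the jar path below is a placeholder for wherever your build put hail-all-spark.jar):

```properties
spark.jars                     /path/to/hail/build/libs/hail-all-spark.jar
spark.driver.extraClassPath    /path/to/hail/build/libs/hail-all-spark.jar
spark.executor.extraClassPath  ./hail-all-spark.jar
```

With `spark.jars` set, Spark ships the jar to each executor’s working directory, which is why the executor classpath can use the relative `./hail-all-spark.jar` path.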

thank you