java.lang.UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()

Hi! I’m brand new to Hail. I’m trying to get the tutorial “01-genome-wide-association-study” working, but I’m stuck on this line in the notebook: hl.utils.get_1kg('data/'), which fails with the following error:
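For context, the notebook cells in question look roughly like this (a sketch; the hl.init() call is assumed from the standard tutorial setup, and only the last call fails):

    # Cells from the 01-genome-wide-association-study notebook (sketch)
    import hail as hl
    hl.init()  # start the Hail/Spark context

    # Downloads a 1000 Genomes subset and writes it as a matrix table
    # under data/ -- this is the call that fails below.
    hl.utils.get_1kg('data/')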

2018-07-15 22:23:13 Hail: INFO: downloading 1KG VCF ...
  Source: https://storage.googleapis.com/hail-tutorial/1kg.vcf.bgz
2018-07-15 22:23:14 Hail: INFO: importing VCF and writing to matrix table...
---------------------------------------------------------------------------
FatalError                                Traceback (most recent call last)
<ipython-input-3-414286e92795> in <module>()
----> 1 hl.utils.get_1kg('data/')

~/hail/python/hail/utils/tutorial.py in get_1kg(output_dir, overwrite)
     66         cluster_readable_vcf = Env.jutils().copyToTmp(jhc, local_path_uri(tmp_vcf), 'vcf')
     67         info('importing VCF and writing to matrix table...')
---> 68         hl.import_vcf(cluster_readable_vcf, min_partitions=16).write(matrix_table_path, overwrite=True)
     69 
     70         tmp_annot = os.path.join(tmp_dir, '1kg_annotations.txt')

~/hail/python/hail/typecheck/check.py in wrapper(*args, **kwargs)
    545         def wrapper(*args, **kwargs):
    546             args_, kwargs_ = check_all(f, args, kwargs, checkers, is_method=is_method)
--> 547             return f(*args_, **kwargs_)
    548 
    549         update_wrapper(wrapper, f)

~/hail/python/hail/methods/impex.py in import_vcf(path, force, force_bgz, header_file, min_partitions, drop_samples, call_fields, reference_genome, contig_recoding, array_elements_required, skip_invalid_loci)
   1799                                    joption(min_partitions), drop_samples, jset_args(call_fields),
   1800                                    joption(rg), joption(contig_recoding), array_elements_required,
-> 1801                                    skip_invalid_loci)
   1802 
   1803     return MatrixTable(jmt)

/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, self.name)
   1134 
   1135         for temp_arg in temp_args:

~/hail/python/hail/utils/java.py in deco(*args, **kwargs)
    194             raise FatalError('%s\n\nJava stack trace:\n%s\n'
    195                              'Hail version: %s\n'
--> 196                              'Error summary: %s' % (deepest, full, hail.__version__, deepest)) from None
    197         except pyspark.sql.utils.CapturedException as e:
    198             raise FatalError('%s\n\nJava stack trace:\n%s\n'

FatalError: UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V

Java stack trace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 2 in stage 1.0 failed 1 times, most recent failure: Lost task 2.0 in stage 1.0 (TID 3, localhost, executor driver): java.lang.UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V
	at is.hail.annotations.Region.nativeCtor(Native Method)
	at is.hail.annotations.Region.<init>(Region.scala:35)
	at is.hail.annotations.Region$.apply(Region.scala:15)
	at is.hail.rvd.RVDContext$.default(RVDContext.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:6)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD.is$hail$sparkextras$ContextRDD$$sparkManagedContext(ContextRDD.scala:129)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:138)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:137)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
	at is.hail.sparkextras.ContextRDD.collect(ContextRDD.scala:143)
	at is.hail.rvd.OrderedRVD$.getPartitionKeyInfo(OrderedRVD.scala:563)
	at is.hail.rvd.OrderedRVD$.coerce(OrderedRVD.scala:656)
	at is.hail.rvd.OrderedRVD$.coerce(OrderedRVD.scala:640)
	at is.hail.io.vcf.LoadVCF$.apply(LoadVCF.scala:911)
	at is.hail.HailContext$$anonfun$importVCFs$2.apply(HailContext.scala:641)
	at is.hail.HailContext$$anonfun$importVCFs$2.apply(HailContext.scala:639)
	at is.hail.HailContext.forceBGZip(HailContext.scala:604)
	at is.hail.HailContext.importVCFs(HailContext.scala:639)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:280)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:745)java.lang.UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V
	at is.hail.annotations.Region.nativeCtor(Native Method)
	at is.hail.annotations.Region.<init>(Region.scala:35)
	at is.hail.annotations.Region$.apply(Region.scala:15)
	at is.hail.rvd.RVDContext$.default(RVDContext.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:6)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD.is$hail$sparkextras$ContextRDD$$sparkManagedContext(ContextRDD.scala:129)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:138)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:137)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)


Hail version: devel-2d767347e1ea
Error summary: UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V

We must have broken our deployment for macOS: an UnsatisfiedLinkError on a native method means the JVM couldn’t load the shared library that backs the JNI call. I’ll have a fix ASAP.

https://github.com/hail-is/hail/issues/3934 should fix it.

(there will probably be a fixed distribution available for download in a few hours)

Thanks, Tim!
Where do I download the new distribution from?
The version in https://storage.googleapis.com/hail-common/distributions/devel/Hail-devel-f7631a0c96cd-Spark-2.2.0.zip still gives me the same error.
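A minimal sanity check here is to confirm which Hail build the session actually loaded, since the same hash is printed in the error summary (a sketch, assuming a fresh Python session):

    # Print the Hail build hash the current session is running.
    # It should match the downloaded distribution (e.g. devel-f7631a0c96cd);
    # a mismatch means an older install is still on the path.
    import hail as hl
    print(hl.__version__)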

OK, that’s the version that should have fixed it. Hmm.

What operating system are you using?

macOS Sierra
10.12.6 (16G1510)
MacBook Pro (Retina, 15-inch, Mid 2015)
Processor: 2.5 GHz Intel Core i7

Hail version: devel-f7631a0c96cd

(hail) mn-mklosi:hail mklosi$ python --version
Python 3.6.6 :: Anaconda, Inc.

(hail) mn-mklosi:hail mklosi$ conda list
# packages in environment at /Users/mklosi/miniconda/envs/hail:
#
# Name                    Version                   Build  Channel
appnope                   0.1.0            py36hf537a9a_0  
backcall                  0.1.0                    py36_0  
blas                      1.0                         mkl  
bleach                    2.1.3                    py36_0  
bokeh                     0.13.0                   py36_0  
ca-certificates           2018.03.07                    0  
certifi                   2018.4.16                py36_0  
cycler                    0.10.0           py36hfc81398_0  
dbus                      1.13.2               h760590f_1  
decorator                 4.3.0                    py36_0  
entrypoints               0.2.3                    py36_2  
expat                     2.2.5                hb8e80ba_0  
freetype                  2.9.1                hb4e5f40_0  
gettext                   0.19.8.1             h15daf44_3  
glib                      2.56.1               h35bc53a_0  
html5lib                  1.0.1                    py36_0  
icu                       58.2                 h4b95b61_1  
intel-openmp              2018.0.3                      0  
ipykernel                 4.8.2                    py36_0  
ipython                   6.4.0                    py36_1  
ipython_genutils          0.2.0            py36h241746c_0  
ipywidgets                7.2.1                    py36_0  
jedi                      0.12.0                   py36_1  
jinja2                    2.10                     py36_0  
jpeg                      9b                   he5867d9_2  
jsonschema                2.6.0            py36hb385e00_0  
jupyter                   1.0.0                    py36_4  
jupyter_client            5.2.3                    py36_0  
jupyter_console           5.2.0                    py36_1  
jupyter_core              4.4.0                    py36_0  
kiwisolver                1.0.1            py36h0a44026_0  
libcxx                    4.0.1                h579ed51_0  
libcxxabi                 4.0.1                hebd6815_0  
libedit                   3.1.20170329         hb402a30_2  
libffi                    3.2.1                h475c297_4  
libgfortran               3.0.1                h93005f0_2  
libiconv                  1.15                 hdd342a3_7  
libpng                    1.6.34               he12f830_0  
libsodium                 1.0.16               h3efe00b_0  
markupsafe                1.0              py36h1de35cc_1  
matplotlib                2.2.2            py36hbf02d85_2  
mistune                   0.8.3            py36h1de35cc_1  
mkl                       2018.0.3                      1  
mkl_fft                   1.0.2            py36h6b9c3cc_0  
mkl_random                1.0.1            py36h5d10147_1  
nbconvert                 5.3.1                    py36_0  
nbformat                  4.4.0            py36h827af21_0  
ncurses                   6.1                  h0a44026_0  
notebook                  5.5.0                    py36_0  
numpy                     1.14.5           py36h648b28d_4  
numpy-base                1.14.5           py36ha9ae307_4  
openssl                   1.0.2o               h26aff7b_0  
packaging                 17.1                     py36_0  
pandas                    0.23.2           py36h6440ff4_0  
pandoc                    2.2.1                h1a437c5_0  
pandocfilters             1.4.2                    py36_1  
parsimonious              0.8.1                     <pip>
parso                     0.2.1                    py36_0  
patsy                     0.5.0                    py36_0  
pcre                      8.42                 h378b8a2_0  
pexpect                   4.6.0                    py36_0  
pickleshare               0.7.4            py36hf512f8e_0  
pip                       10.0.1                   py36_0  
prompt_toolkit            1.0.15           py36haeda067_0  
ptyprocess                0.6.0                    py36_0  
py4j                      0.10.6           py36hde0549a_1  
pygments                  2.2.0            py36h240cd3f_0  
pyparsing                 2.2.0                    py36_1  
pyqt                      5.9.2            py36h655552a_0  
pyspark                   2.2.0                    py36_0  
python                    3.6.6                hc167b69_0  
python-dateutil           2.7.3                    py36_0  
pytz                      2018.5                   py36_0  
pyyaml                    3.12             py36h1de35cc_1  
pyzmq                     17.0.0           py36h1de35cc_3  
qt                        5.9.6                h74ce4d9_0  
qtconsole                 4.3.1            py36hd96c0ff_0  
readline                  7.0                  hc1231fa_4  
scipy                     1.1.0            py36hf1f7d93_0  
seaborn                   0.8.1                    py36_0  
send2trash                1.5.0                    py36_0  
setuptools                39.2.0                   py36_0  
simplegeneric             0.8.1                    py36_2  
sip                       4.19.8           py36h0a44026_0  
six                       1.11.0                   py36_1  
sqlite                    3.24.0               ha441bb4_0  
statsmodels               0.9.0            py36h1d22016_0  
terminado                 0.8.1                    py36_1  
testpath                  0.3.1            py36h625a49b_0  
tk                        8.6.7                h35a86e2_3  
tornado                   5.0.2            py36h1de35cc_0  
traitlets                 4.3.2            py36h65bd3ce_0  
wcwidth                   0.1.7            py36h8c6ec74_0  
webencodings              0.5.1                    py36_1  
wheel                     0.31.1                   py36_0  
widgetsnbextension        3.2.1                    py36_0  
xz                        5.2.4                h1de35cc_4  
yaml                      0.1.7                hc338f04_2  
zeromq                    4.2.5                h0a44026_0  
zlib                      1.2.11               hf3cbc9b_2

We’re a bit stumped here. What version of Java are you running?

JAVA_VERSION="1.8.0_121"
OS_NAME="Darwin"
OS_VERSION="11.2"
OS_ARCH="x86_64"
SOURCE=" .:f2b5b6ab1f55 corba:386e9b79fcf5 deploy:7130ca3292fd hotspot:90f94521c351 hotspot/make/closed:bb6215e98e28 hotspot/src/closed:5c67a72be91c hotspot/test/closed:262c6cd71fd1 install:709a5016570e jaxp:b8d4e4724071 jaxws:5b8834cc3bb9 jdk:2974746e5619 jdk/make/closed:14736f778a50 jdk/src/closed:784a1cdcf90e jdk/test/closed:75844215d99a langtools:f634736433d9 nashorn:fd548ea7e156 pubs:6c3449393359 sponsors:d32775ed283a"
BUILD_TYPE="commercial"

I’ll try to set up everything from scratch.

@tpoterba I downloaded the newest Hail version, devel-56ea3941ae61, and recreated my environment from scratch. Still the same error :frowning: I’m reattaching the output, just in case something has changed.

2018-07-17 19:24:51 Hail: INFO: downloading 1KG VCF ...
  Source: https://storage.googleapis.com/hail-tutorial/1kg.vcf.bgz
2018-07-17 19:24:52 Hail: INFO: importing VCF and writing to matrix table...
---------------------------------------------------------------------------
FatalError                                Traceback (most recent call last)
<ipython-input-3-414286e92795> in <module>()
----> 1 hl.utils.get_1kg('data/')

~/hail/python/hail/utils/tutorial.py in get_1kg(output_dir, overwrite)
     66         cluster_readable_vcf = Env.jutils().copyToTmp(jhc, local_path_uri(tmp_vcf), 'vcf')
     67         info('importing VCF and writing to matrix table...')
---> 68         hl.import_vcf(cluster_readable_vcf, min_partitions=16).write(matrix_table_path, overwrite=True)
     69 
     70         tmp_annot = os.path.join(tmp_dir, '1kg_annotations.txt')

~/hail/python/hail/typecheck/check.py in wrapper(*args, **kwargs)
    545         def wrapper(*args, **kwargs):
    546             args_, kwargs_ = check_all(f, args, kwargs, checkers, is_method=is_method)
--> 547             return f(*args_, **kwargs_)
    548 
    549         update_wrapper(wrapper, f)

~/hail/python/hail/matrixtable.py in write(self, output, overwrite, stage_locally, _codec_spec)
   2116         """
   2117 
-> 2118         self._jvds.write(output, overwrite, stage_locally, _codec_spec)
   2119 
   2120     def globals_table(self) -> Table:

/usr/local/spark-2.2.0-bin-hadoop2.7/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, self.name)
   1134 
   1135         for temp_arg in temp_args:

~/hail/python/hail/utils/java.py in deco(*args, **kwargs)
    198             raise FatalError('%s\n\nJava stack trace:\n%s\n'
    199                              'Hail version: %s\n'
--> 200                              'Error summary: %s' % (deepest, full, hail.__version__, deepest)) from None
    201         except pyspark.sql.utils.CapturedException as e:
    202             raise FatalError('%s\n\nJava stack trace:\n%s\n'

FatalError: UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V

Java stack trace:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 4 in stage 1.0 failed 1 times, most recent failure: Lost task 4.0 in stage 1.0 (TID 5, localhost, executor driver): java.lang.UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V
	at is.hail.annotations.Region.nativeCtor(Native Method)
	at is.hail.annotations.Region.<init>(Region.scala:35)
	at is.hail.annotations.Region$.apply(Region.scala:15)
	at is.hail.rvd.RVDContext$.default(RVDContext.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:6)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD.is$hail$sparkextras$ContextRDD$$sparkManagedContext(ContextRDD.scala:129)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:138)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:137)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1499)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1487)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1486)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1486)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:814)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:814)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1714)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1669)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1658)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:630)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2022)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2043)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2062)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2087)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:936)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:935)
	at is.hail.sparkextras.ContextRDD.collect(ContextRDD.scala:143)
	at is.hail.rvd.OrderedRVD$.getPartitionKeyInfo(OrderedRVD.scala:574)
	at is.hail.rvd.OrderedRVD$.makeCoercer(OrderedRVD.scala:669)
	at is.hail.io.vcf.MatrixVCFReader.coercer$lzycompute(LoadVCF.scala:972)
	at is.hail.io.vcf.MatrixVCFReader.coercer(LoadVCF.scala:972)
	at is.hail.io.vcf.MatrixVCFReader.apply(LoadVCF.scala:1004)
	at is.hail.expr.ir.MatrixRead.execute(MatrixIR.scala:413)
	at is.hail.expr.ir.Interpret$.apply(Interpret.scala:573)
	at is.hail.expr.ir.Interpret$.apply(Interpret.scala:39)
	at is.hail.expr.ir.Interpret$.apply(Interpret.scala:15)
	at is.hail.variant.MatrixTable.write(MatrixTable.scala:1717)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:280)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:214)
	at java.lang.Thread.run(Thread.java:745)java.lang.UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V
	at is.hail.annotations.Region.nativeCtor(Native Method)
	at is.hail.annotations.Region.<init>(Region.scala:35)
	at is.hail.annotations.Region$.apply(Region.scala:15)
	at is.hail.rvd.RVDContext$.default(RVDContext.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:8)
	at is.hail.rvd.package$RVDContextIsPointed$.point(package.scala:6)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD$Weaken$$anonfun$apply$4.apply(ContextRDD.scala:64)
	at is.hail.sparkextras.ContextRDD.is$hail$sparkextras$ContextRDD$$sparkManagedContext(ContextRDD.scala:129)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:138)
	at is.hail.sparkextras.ContextRDD$$anonfun$run$1.apply(ContextRDD.scala:137)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
	at org.apache.spark.scheduler.Task.run(Task.scala:108)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)


Hail version: devel-56ea3941ae61
Error summary: UnsatisfiedLinkError: is.hail.annotations.Region.nativeCtor()V

Ack, thanks for helping us figure this out! We discussed it a bit today without reaching a solution. I’ll try again tomorrow to reproduce your problem locally.

We have someone else reporting this issue with our deployment on Google Cloud. Something nefarious is going on.

We’ve isolated the problem, fix soon!

Fix is in and building; it should be live tonight.

Sorry about that!

No problem, and thanks for fixing this so attentively.

2018-07-18 22:52:55 Hail: INFO: downloading 1KG VCF ...
  Source: https://storage.googleapis.com/hail-tutorial/1kg.vcf.bgz
2018-07-18 22:52:57 Hail: INFO: importing VCF and writing to matrix table...
2018-07-18 22:52:59 Hail: INFO: Coerced sorted dataset
2018-07-18 22:53:02 Hail: INFO: wrote 10961 items in 16 partitions to data/1kg.mt
2018-07-18 22:53:02 Hail: INFO: downloading 1KG annotations ...
  Source: https://storage.googleapis.com/hail-tutorial/1kg_annotations.txt
2018-07-18 22:53:03 Hail: INFO: Done!

Yay!
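For completeness, the next tutorial step reads the freshly written matrix table back in; a minimal sketch (hl.read_matrix_table is the standard Hail reader, and the row count comes from the log above):

    # Read back the matrix table written by get_1kg and confirm its size.
    import hail as hl
    mt = hl.read_matrix_table('data/1kg.mt')
    print(mt.count())  # (rows, cols); the log above reports 10961 rows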