UnsatisfiedLinkError during identity_by_descent - incompatible architecture (have 'x86_64', need 'arm64')

I have been working on an analysis with Hail for some time but have hit a new error while trying to run identity_by_descent. I have never had this particular call working with my current setup.

A minimal reproducible example is below, though I suspect this is a problem with the installation of some dependency.

I’m working on an M2 Mac and installed Hail via pip3.

  • Hail version: 0.2.126-ee77707f4fab
  • Java version:
    java version "1.8.0_381"
    Java(TM) SE Runtime Environment (build 1.8.0_381-b09)
    Java HotSpot(TM) 64-Bit Server VM (build 25.381-b09, mixed mode)
import os
# Point to Java 8, which Hail requires
os.environ['JAVA_HOME'] = '/Library/Java/JavaVirtualMachines/jdk-1.8.jdk/Contents/Home'
os.environ['PYSPARK_SUBMIT_ARGS'] = "--driver-memory 400G pyspark-shell"
import hail as hl
hl.init(local='local[*]')

mt = hl.read_matrix_table('data/1kg.mt')
ibd = hl.identity_by_descent(mt, min=0.125)

This results in the following error:

FatalError                                Traceback (most recent call last)
Cell In[6], line 1
----> 1 ibd_pop = hl.identity_by_descent(mt, min=0.125)

File <decorator-gen-1696>:2, in identity_by_descent(dataset, maf, bounded, min, max)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/typecheck/check.py:587, in _make_dec.<locals>.wrapper(__original_func, *args, **kwargs)
    584 @decorator
    585 def wrapper(__original_func: Callable[..., T], *args, **kwargs) -> T:
    586     args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 587     return __original_func(*args_, **kwargs_)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/methods/relatedness/identity_by_descent.py:114, in identity_by_descent(dataset, maf, bounded, min, max)
    105 dataset = require_biallelic(dataset, 'ibd')
    107 if isinstance(Env.backend(), SparkBackend):
    108     return Table(ir.MatrixToTableApply(dataset._mir, {
    109         'name': 'IBD',
    110         'mafFieldName': '__maf' if maf is not None else None,
    111         'bounded': bounded,
    112         'min': min,
    113         'max': max,
--> 114     })).persist()
    116 min = min or 0
    117 max = max or 1

File <decorator-gen-1246>:2, in persist(self, storage_level)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/typecheck/check.py:587, in _make_dec.<locals>.wrapper(__original_func, *args, **kwargs)
    584 @decorator
    585 def wrapper(__original_func: Callable[..., T], *args, **kwargs) -> T:
    586     args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 587     return __original_func(*args_, **kwargs_)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/table.py:2112, in Table.persist(self, storage_level)
   2076 @typecheck_method(storage_level=storage_level)
   2077 def persist(self, storage_level='MEMORY_AND_DISK') -> 'Table':
   2078     """Persist this table in memory or on disk.
   2079 
   2080     Examples
   (...)
   2110         Persisted table.
   2111     """
-> 2112     return Env.backend().persist(self)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/backend/backend.py:289, in Backend.persist(self, dataset)
    287 from hail.context import TemporaryFilename
    288 tempfile = TemporaryFilename(prefix=f'persist_{type(dataset).__name__}')
--> 289 persisted = dataset.checkpoint(tempfile.__enter__())
    290 self._persisted_locations[persisted] = (tempfile, dataset)
    291 return persisted

File <decorator-gen-1236>:2, in checkpoint(self, output, overwrite, stage_locally, _codec_spec, _read_if_exists, _intervals, _filter_intervals)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/typecheck/check.py:587, in _make_dec.<locals>.wrapper(__original_func, *args, **kwargs)
    584 @decorator
    585 def wrapper(__original_func: Callable[..., T], *args, **kwargs) -> T:
    586     args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 587     return __original_func(*args_, **kwargs_)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/table.py:1332, in Table.checkpoint(self, output, overwrite, stage_locally, _codec_spec, _read_if_exists, _intervals, _filter_intervals)
   1329 hl.current_backend().validate_file(output)
   1331 if not _read_if_exists or not hl.hadoop_exists(f'{output}/_SUCCESS'):
-> 1332     self.write(output=output, overwrite=overwrite, stage_locally=stage_locally, _codec_spec=_codec_spec)
   1333     _assert_type = self._type
   1334     _load_refs = False

File <decorator-gen-1238>:2, in write(self, output, overwrite, stage_locally, _codec_spec)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/typecheck/check.py:587, in _make_dec.<locals>.wrapper(__original_func, *args, **kwargs)
    584 @decorator
    585 def wrapper(__original_func: Callable[..., T], *args, **kwargs) -> T:
    586     args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 587     return __original_func(*args_, **kwargs_)

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/table.py:1378, in Table.write(self, output, overwrite, stage_locally, _codec_spec)
   1352 """Write to disk.
   1353 
   1354 Examples
   (...)
   1373     If ``True``, overwrite an existing file at the destination.
   1374 """
   1376 hl.current_backend().validate_file(output)
-> 1378 Env.backend().execute(ir.TableWrite(self._tir, ir.TableNativeWriter(output, overwrite, stage_locally, _codec_spec)))

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/backend/backend.py:180, in Backend.execute(self, ir, timed)
    178     result, timings = self._rpc(ActionTag.EXECUTE, payload)
    179 except FatalError as e:
--> 180     raise e.maybe_user_error(ir) from None
    181 if ir.typ == tvoid:
    182     value = None

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/backend/backend.py:178, in Backend.execute(self, ir, timed)
    176 payload = ExecutePayload(self._render_ir(ir), '{"name":"StreamBufferSpec"}', timed)
    177 try:
--> 178     result, timings = self._rpc(ActionTag.EXECUTE, payload)
    179 except FatalError as e:
    180     raise e.maybe_user_error(ir) from None

File /Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/site-packages/hail/backend/py4j_backend.py:213, in Py4JBackend._rpc(self, action, payload)
    211 if resp.status_code >= 400:
    212     error_json = orjson.loads(resp.content)
--> 213     raise fatal_error_from_java_error_triplet(error_json['short'], error_json['expanded'], error_json['error_id'])
    214 return resp.content, resp.headers.get('X-Hail-Timings', '')

FatalError: UnsatisfiedLinkError: dlopen(/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp, 0x0009): tried: '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (no such file), '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

Java stack trace:
java.lang.UnsatisfiedLinkError: dlopen(/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp, 0x0009): tried: '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (no such file), '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
	at com.sun.jna.Native.open(Native Method)
	at com.sun.jna.NativeLibrary.loadLibrary(NativeLibrary.java:298)
	at com.sun.jna.NativeLibrary.getInstance(NativeLibrary.java:483)
	at com.sun.jna.Native.register(Native.java:1774)
	at com.sun.jna.Native.register(Native.java:1493)
	at is.hail.methods.IBSFFI$.<init>(IBSFFI.scala:17)
	at is.hail.methods.IBSFFI$.<clinit>(IBSFFI.scala)
	at is.hail.methods.IBD$.$anonfun$computeIBDMatrix$10(IBD.scala:244)
	at scala.runtime.java8.JFunction0$mcB$sp.apply(JFunction0$mcB$sp.java:23)
	at scala.Array$.fill(Array.scala:354)
	at is.hail.methods.IBD$.computeIBDMatrix(IBD.scala:244)
	at is.hail.methods.IBD.execute(IBD.scala:348)
	at is.hail.expr.ir.functions.WrappedMatrixToTableFunction.execute(RelationalFunctions.scala:52)
	at is.hail.expr.ir.TableToTableApply.execute(TableIR.scala:3377)
	at is.hail.expr.ir.TableIR.analyzeAndExecute(TableIR.scala:59)
	at is.hail.expr.ir.Interpret$.run(Interpret.scala:864)
	at is.hail.expr.ir.Interpret$.alreadyLowered(Interpret.scala:58)
	at is.hail.expr.ir.LowerOrInterpretNonCompilable$.evaluate$1(LowerOrInterpretNonCompilable.scala:20)
	at is.hail.expr.ir.LowerOrInterpretNonCompilable$.rewrite$1(LowerOrInterpretNonCompilable.scala:58)
	at is.hail.expr.ir.LowerOrInterpretNonCompilable$.apply(LowerOrInterpretNonCompilable.scala:63)
	at is.hail.expr.ir.lowering.LowerOrInterpretNonCompilablePass$.transform(LoweringPass.scala:77)
	at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$3(LoweringPass.scala:26)
	at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
	at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$1(LoweringPass.scala:26)
	at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
	at is.hail.expr.ir.lowering.LoweringPass.apply(LoweringPass.scala:24)
	at is.hail.expr.ir.lowering.LoweringPass.apply$(LoweringPass.scala:23)
	at is.hail.expr.ir.lowering.LowerOrInterpretNonCompilablePass$.apply(LoweringPass.scala:72)
	at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1(LoweringPipeline.scala:22)
	at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1$adapted(LoweringPipeline.scala:20)
	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
	at is.hail.expr.ir.lowering.LoweringPipeline.apply(LoweringPipeline.scala:20)
	at is.hail.expr.ir.CompileAndEvaluate$._apply(CompileAndEvaluate.scala:50)
	at is.hail.backend.spark.SparkBackend._execute(SparkBackend.scala:517)
	at is.hail.backend.spark.SparkBackend.$anonfun$execute$4(SparkBackend.scala:546)
	at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
	at is.hail.backend.spark.SparkBackend.$anonfun$execute$3(SparkBackend.scala:542)
	at is.hail.backend.spark.SparkBackend.$anonfun$execute$3$adapted(SparkBackend.scala:541)
	at is.hail.backend.ExecuteContext$.$anonfun$scoped$3(ExecuteContext.scala:76)
	at is.hail.utils.package$.using(package.scala:657)
	at is.hail.backend.ExecuteContext$.$anonfun$scoped$2(ExecuteContext.scala:76)
	at is.hail.utils.package$.using(package.scala:657)
	at is.hail.annotations.RegionPool$.scoped(RegionPool.scala:17)
	at is.hail.backend.ExecuteContext$.scoped(ExecuteContext.scala:62)
	at is.hail.backend.spark.SparkBackend.$anonfun$withExecuteContext$3(SparkBackend.scala:368)
	at is.hail.utils.ExecutionTimer$.time(ExecutionTimer.scala:52)
	at is.hail.utils.ExecutionTimer$.logTime(ExecutionTimer.scala:59)
	at is.hail.backend.spark.SparkBackend.$anonfun$withExecuteContext$2(SparkBackend.scala:364)
	at is.hail.backend.spark.SparkBackend.execute(SparkBackend.scala:541)
	at is.hail.backend.BackendHttpHandler.handle(BackendServer.scala:81)
	at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
	at sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:83)
	at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:82)
	at sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:833)
	at com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:79)
	at sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:802)
	at sun.net.httpserver.ServerImpl$DefaultExecutor.execute(ServerImpl.java:201)
	at sun.net.httpserver.ServerImpl$Dispatcher.handle(ServerImpl.java:551)
	at sun.net.httpserver.ServerImpl$Dispatcher.run(ServerImpl.java:516)
	at java.lang.Thread.run(Thread.java:750)



Hail version: 0.2.126-ee77707f4fab
Error summary: UnsatisfiedLinkError: dlopen(/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp, 0x0009): tried: '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (no such file), '/Users/abgane/Library/Caches/JNA/temp/jna2578860459691539901.tmp' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))

I can't find this issue reported elsewhere for Hail, and I suspect it relates to a problem with the installation. Some people seem to have had success installing other Python packages after specifying:

export ARCHFLAGS="-arch arm64"

Removing Hail and reinstalling with this set makes no difference.

I have not had this problem with any other Hail method, including ld_prune and running PCA.

cat `which java` | file - 
/dev/stdin: Mach-O universal binary with 2 architectures: [x86_64:Mach-O 64-bit executable x86_64] [arm64e:Mach-O 64-bit executable arm64e]
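For reference, the architecture the Python interpreter itself reports can be checked as well, to rule out the Python process running under Rosetta 2:

```python
import platform

# 'arm64' for a native Apple Silicon Python; 'x86_64' if the interpreter
# is an Intel build running under Rosetta 2 translation
print(platform.machine())
```

In my case this reports arm64, so the Python side appears to be native.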

If anyone has advice on what might be causing the error or how to resolve it, I would greatly appreciate it!

Thanks,
Angus

Hey @abg! This is our fault: we need to build Hail for M2 Macs. I've created an issue: [query] Hail needs to include a universal SO that works on M2s · Issue #14000 · hail-is/hail · GitHub

In the meantime, you can use hl.king or hl.pc_relate, or run on a Linux machine.
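As a rough sketch of the hl.king workaround (assuming the data/1kg.mt path from your post, and using a phi threshold of 0.125 to mirror your min=0.125 — pick a cutoff appropriate for your analysis):

```python
# Interim workaround: KING kinship instead of identity_by_descent.
# hl.king is computed with block-matrix linear algebra rather than the
# native IBS library, so it sidesteps the JNA dlopen failure above.
import importlib.util

# Guard so this snippet is a no-op where Hail isn't installed.
HAVE_HAIL = importlib.util.find_spec("hail") is not None

if HAVE_HAIL:
    import hail as hl

    mt = hl.read_matrix_table('data/1kg.mt')
    # hl.king returns a MatrixTable of pairwise KING kinship estimates
    # with an entry field 'phi'.
    king = hl.king(mt.GT)
    related = king.filter_entries(king.phi > 0.125).entries()
    related.show()
```

Note that KING's phi is a kinship coefficient, not an IBD proportion, so the threshold is not directly interchangeable with the identity_by_descent min.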

A pull request to fix this is now posted: [query] universal dylibs for OS X (and update prebuilt) by danking · Pull Request #14006 · hail-is/hail · GitHub

This will be included in 0.2.127.