Does it run on an M1 Mac?

I tried several things to get this to work locally on an M1 Mac, but I keep running into the max-recursion error when calling hl.init(sc).

Has anyone solved this problem?

The steps I take are as follows:

  1. Initialize PySpark using the following command:
/opt/spark-3.3.3-bin-hadoop3/bin/pyspark --jars /Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/hail-all-spark.jar --conf spark.serializer=org.apache.spark.serializer.KryoSerializer --conf spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator --conf spark.driver.extraClassPath=/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/hail-all-spark.jar --conf spark.executor.extraClassPath=./hail-all-spark.jar

Python 3.11.5 (main, Sep 11 2023, 08:31:25) [Clang 14.0.6 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
SLF4J: No SLF4J providers were found.
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See https://www.slf4j.org/codes.html#noProviders for further details.
SLF4J: Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier.
SLF4J: Ignoring binding found at [jar:file:/opt/spark-3.3.3-bin-hadoop3/jars/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See https://www.slf4j.org/codes.html#ignoredBindings for an explanation.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.3.3
      /_/

Using Python version 3.11.5 (main, Sep 11 2023 08:31:25)
Spark context Web UI available at http://bens-mbp:4040
Spark context available as 'sc' (master = local[*], app id = local-1702262616064).
SparkSession available as 'spark'.
  2. Then I try to initialize Hail by passing in a reference to sc.
>>> import hail as hl
>>> hl.init(sc)
pip-installed Hail requires additional configuration options in Spark referring
  to the path to the Hail Python module directory HAIL_DIR,
  e.g. /path/to/python/site-packages/hail:
    spark.jars=HAIL_DIR/backend/hail-all-spark.jar
    spark.driver.extraClassPath=HAIL_DIR/backend/hail-all-spark.jar
    spark.executor.extraClassPath=./hail-all-spark.jar
Running on Apache Spark version 3.3.3
......
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/spark_backend.py", line 208, in __init__
    self._jbackend = hail_package.backend.spark.SparkBackend.apply(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/spark-3.3.3-bin-hadoop3/python/lib/py4j-0.10.9.5-src.zip/py4j/java_gateway.py", line 1321, in __call__
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 33, in deco
    tpl = Env.jutils().handleForPython(e.java_exception)
          ^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 58, in jutils
    return Env.py4j_backend('Env.jutils').utils_package_object()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 93, in py4j_backend
    b = Env.backend()
        ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 88, in backend
    return Env.hc()._backend
           ^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 66, in hc
    init()
  File "<decorator-gen-1734>", line 2, in init
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 323, in init
    backend = choose_backend(backend)
              ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 9, in choose_backend
    return configuration_of('query', 'backend', backend, 'spark')
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hailtop/config/user_config.py", line 72, in configuration_of
    from_user_config = get_user_config().get(section, option, fallback=None)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/configparser.py", line 797, in get
    d = self._unify_values(section, vars)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/configparser.py", line 1168, in _unify_values
    raise NoSectionError(section) from None
          ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/configparser.py", line 186, in __init__
    Error.__init__(self, 'No section: %r' % (section,))
  File "/Users/bensaini/anaconda3/lib/python3.11/configparser.py", line 174, in __init__
    Exception.__init__(self, msg)
RecursionError: maximum recursion depth exceeded while calling a Python object

It fails with the error shown above.

@enriquea or @danielgoldstein - would you be so kind as to spare some time for a Zoom call to help me fix this? I will be happy to pay you for your time and help.

Thanks,
-Ben

We don’t support this kind of initialization. Use ipython or python as described here: Hail | Your First Hail Query. Do not use pyspark directly.
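
For reference, a minimal sketch of that flow, run with plain python or ipython and reusing the same balding_nichols_model toy query that appears later in this thread:

import hail as hl

# Let Hail create and configure its own local Spark context; calling hl.init()
# explicitly is optional, since the first Hail query initializes it with defaults.
hl.init()
mt = hl.balding_nichols_model(n_populations=3, n_samples=10, n_variants=100)
mt.show()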

In the future, please always include the full stack trace.

@danking - thank you for your response. Please see the full stack trace below, produced using the suggested approach.

(base) bensaini@Bens-MacBook-Pro ~ %
(base) bensaini@Bens-MacBook-Pro ~ % python --version
Python 3.11.5
(base) bensaini@Bens-MacBook-Pro ~ % cat test_hail.py
import hail as hl
mt = hl.balding_nichols_model(n_populations=3,
                              n_samples=10,
                              n_variants=100)
mt.show()

(base) bensaini@Bens-MacBook-Pro ~ % python test_hail.py >> test_hail.log 2>&1

The contents of test_hail.log are below.

Initializing Hail with default parameters...
SLF4J: No SLF4J providers were found.
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See https://www.slf4j.org/codes.html#noProviders for further details.
SLF4J: Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier.
SLF4J: Ignoring binding found at [jar:file:/opt/spark-3.3.3-bin-hadoop3/jars/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See https://www.slf4j.org/codes.html#ignoredBindings for an explanation.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Running on Apache Spark version 3.3.3
SparkUI available at http://bens-mbp:4040
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
…(log continued)

The following error repeats several times, and the log ends with the last line shown below.

Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Initializing Hail with default parameters...
Traceback (most recent call last):
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 25, in deco
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o34.exists.
: java.net.ConnectException: Call From Bens-MBP/192.168.132.162 to localhost:9000 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:828)
	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
	at org.apache.hadoop.ipc.Client.call(Client.java:1558)
	at org.apache.hadoop.ipc.Client.call(Client.java:1455)
	at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
	at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129)
	at com.sun.proxy.$Proxy33.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:965)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
	at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
	at com.sun.proxy.$Proxy34.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1739)
	at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1753)
	at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1750)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1765)
	at is.hail.io.fs.HadoopFS.fileStatus(HadoopFS.scala:189)
	at is.hail.io.fs.FS.exists(FS.scala:465)
	at is.hail.io.fs.FS.exists$(FS.scala:463)
	at is.hail.io.fs.HadoopFS.exists(HadoopFS.scala:81)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused
	at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
	at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:716)
	at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
	at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
	at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:711)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:833)
	at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:414)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1677)
	at org.apache.hadoop.ipc.Client.call(Client.java:1502)
	... 36 more


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 25, in deco
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling z:is.hail.backend.spark.SparkBackend.apply.
: java.lang.IllegalArgumentException: requirement failed
	at scala.Predef$.require(Predef.scala:268)
	at is.hail.backend.spark.SparkBackend$.apply(SparkBackend.scala:232)
	at is.hail.backend.spark.SparkBackend.apply(SparkBackend.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
	at java.lang.Thread.run(Thread.java:748)


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 25, in deco
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling z:is.hail.backend.spark.SparkBackend.apply.
: java.lang.IllegalArgumentException: requirement failed
	at scala.Predef$.require(Predef.scala:268)
	at is.hail.backend.spark.SparkBackend$.apply(SparkBackend.scala:232)
	at is.hail.backend.spark.SparkBackend.apply(SparkBackend.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
	at java.lang.Thread.run(Thread.java:748)


During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 25, in deco
    return f(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/protocol.py", line 326, in get_return_value
.....

  File "<decorator-gen-1734>", line 2, in init
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 353, in init
    return init_spark(
           ^^^^^^^^^^^
  File "<decorator-gen-1736>", line 2, in init_spark
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 436, in init_spark
    backend = SparkBackend(
              ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/spark_backend.py", line 208, in __init__
    self._jbackend = hail_package.backend.spark.SparkBackend.apply(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 33, in deco
    tpl = Env.jutils().handleForPython(e.java_exception)
          ^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 58, in jutils
    return Env.py4j_backend('Env.jutils').utils_package_object()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 93, in py4j_backend
    b = Env.backend()
        ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 88, in backend
    return Env.hc()._backend
           ^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 66, in hc
    init()
  File "<decorator-gen-1734>", line 2, in init
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 353, in init
    return init_spark(
           ^^^^^^^^^^^
  File "<decorator-gen-1736>", line 2, in init_spark
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 436, in init_spark
    backend = SparkBackend(
              ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/spark_backend.py", line 208, in __init__
    self._jbackend = hail_package.backend.spark.SparkBackend.apply(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
                   ^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/py4j_backend.py", line 33, in deco
    tpl = Env.jutils().handleForPython(e.java_exception)
          ^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 58, in jutils
    return Env.py4j_backend('Env.jutils').utils_package_object()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 93, in py4j_backend
    b = Env.backend()
        ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 88, in backend
    return Env.hc()._backend
           ^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/utils/java.py", line 66, in hc
    init()
  File "<decorator-gen-1734>", line 2, in init
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 353, in init
    return init_spark(
           ^^^^^^^^^^^
  File "<decorator-gen-1736>", line 2, in init_spark
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/typecheck/check.py", line 584, in wrapper
    return __original_func(*args_, **kwargs_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/context.py", line 436, in init_spark
    backend = SparkBackend(
              ^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/hail/backend/spark_backend.py", line 141, in __init__
    conf = pyspark.SparkConf()
           ^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/pyspark/conf.py", line 131, in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/java_gateway.py", line 1571, in __call__
    (new_args, temp_args) = self._get_args(args)
                            ^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/java_gateway.py", line 1556, in _get_args
    if converter.can_convert(arg):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/bensaini/anaconda3/lib/python3.11/site-packages/py4j/java_collections.py", line 490, in can_convert
    return isinstance(object, Set)
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen abc>", line 119, in __instancecheck__
RecursionError: maximum recursion depth exceeded in comparison

@danking - please note that the errors in my first and last posts are the same.

The reason I didn’t use ipython first is that the sheer length of the error log scrolls most of the initial commands and errors out of the terminal window.

Thanks,
-Ben

For posterity, let me share how I resolved the issue, since it is not already mentioned in the getting-started documentation.

Besides installing Spark and Hadoop on the local machine, I had to do the following.

Ensure Spark is shut down by calling $SPARK_HOME/sbin/stop-all.sh.

hadoop namenode -format initializes (formats) Hadoop’s master node, the NameNode, on the local machine.

hadoop namenode then starts the NameNode instance on the local machine.

You should then see a server listening on port 9000. Leave it running in that terminal window.
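
A quick sanity check (my own snippet, not from the Hail or Hadoop docs) to confirm from Python that something is accepting connections on localhost:9000 before running the test script:

import socket

# Probe localhost:9000; connect_ex returns 0 when a listener (the NameNode) accepts.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.settimeout(2)
    reachable = s.connect_ex(("localhost", 9000)) == 0

print("NameNode reachable on port 9000" if reachable else "nothing listening on port 9000")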

Run the Hail test script from another terminal window in ipython or python. You should then see something like the following.

Initializing Hail with default parameters...
SLF4J: No SLF4J providers were found.
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See https://www.slf4j.org/codes.html#noProviders for further details.
SLF4J: Class path contains SLF4J bindings targeting slf4j-api versions 1.7.x or earlier.
SLF4J: Ignoring binding found at [jar:file:/opt/spark-3.3.3-bin-hadoop3/jars/log4j-slf4j-impl-2.17.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See https://www.slf4j.org/codes.html#ignoredBindings for an explanation.
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Running on Apache Spark version 3.3.3
SparkUI available at http://bens-mbp:4040
Welcome to
     __  __     <>__
    / /_/ /__  __/ /
   / __  / _ `/ / /
  /_/ /_/\_,_/_/_/   version 0.2.120-f00f916faf78
LOGGING: writing to /Users/bensaini/hail-20231214-1616-0.2.120-f00f916faf78.log
2023-12-14 16:16:26.257 Hail: INFO: balding_nichols_model: generating genotypes for 3 populations, 10 samples, and 100 variants...
[Stage 0:>                                                          (0 + 4) / 4]
+---------------+------------+------+------+------+------+
| locus         | alleles    | 0.GT | 1.GT | 2.GT | 3.GT |
+---------------+------------+------+------+------+------+
| locus<GRCh37> | array<str> | call | call | call | call |
+---------------+------------+------+------+------+------+
| 1:1           | ["A","C"]  | 0/1  | 0/0  | 0/1  | 0/0  |
| 1:2           | ["A","C"]  | 1/1  | 1/1  | 1/1  | 1/1  |
| 1:3           | ["A","C"]  | 0/1  | 0/1  | 1/1  | 0/1  |
| 1:4           | ["A","C"]  | 0/1  | 0/0  | 0/1  | 0/0  |
| 1:5           | ["A","C"]  | 0/1  | 0/1  | 0/1  | 0/0  |
| 1:6           | ["A","C"]  | 1/1  | 1/1  | 1/1  | 1/1  |
| 1:7           | ["A","C"]  | 0/1  | 1/1  | 1/1  | 0/1  |
| 1:8           | ["A","C"]  | 0/0  | 0/1  | 0/1  | 0/0  |
| 1:9           | ["A","C"]  | 0/0  | 0/0  | 0/0  | 0/0  |
| 1:10          | ["A","C"]  | 0/0  | 0/0  | 0/1  | 0/1  |
+---------------+------------+------+------+------+------+
showing top 10 rows
showing the first 4 of 10 columns

Good luck.