Unfortunately, I still get an error.
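For context, the interactive session that produced it was roughly the following. This is a reconstruction rather than the exact call: the traceback below only records that balding_nichols_model() was invoked from line 4 of an interactive session, so the arguments here are placeholders.

    import hail as hl

    # Placeholder arguments -- the traceback does not show what was actually passed.
    # hl.init() is not called explicitly, so Hail starts Spark implicitly on the
    # first call below, which is where the Py4JJavaError in the output is raised.
    mt = hl.balding_nichols_model(n_populations=3, n_samples=100, n_variants=100)
    mt.count()

Here is the complete output: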
Initializing Hail with default parameters...
19:30:22.650 [main] WARN org.apache.spark.util.Utils - Your hostname, MacBook-Air-3.local resolves to a loopback address: 127.0.0.1; using 172.26.143.71 instead (on interface en0)
19:30:22.656 [main] WARN org.apache.spark.util.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
19:30:23.074 [main] DEBUG o.a.spark.util.ShutdownHookManager - Adding shutdown hook
19:30:23.111 [main] DEBUG org.apache.hadoop.util.Shell - Failed to detect a valid hadoop home directory
java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:468) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:439) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:516) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.conf.Configuration.getTimeDurationHelper(Configuration.java:1814) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1791) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207) [hadoop-common-3.2.0.jar:na]
at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:302) [hadoop-common-3.2.0.jar:na]
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:181) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:153) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.util.Utils$.createTempDir(Utils.scala:326) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:343) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048) [spark-core_2.12-3.1.2.jar:3.1.2]
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) [spark-core_2.12-3.1.2.jar:3.1.2]
19:30:23.135 [main] DEBUG org.apache.hadoop.util.Shell - setsid is not available on this machine. So not using it.
19:30:23.135 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
19:30:23.320 [main] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)])
19:30:23.323 [main] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)])
19:30:23.323 [main] DEBUG o.a.h.m.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[GetGroups])
19:30:23.324 [main] DEBUG o.a.h.m.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeLong org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailuresTotal with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Renewal failures since startup])
19:30:23.325 [main] DEBUG o.a.h.m.lib.MutableMetricsFactory - field private org.apache.hadoop.metrics2.lib.MutableGaugeInt org.apache.hadoop.security.UserGroupInformation$UgiMetrics.renewalFailures with annotation @org.apache.hadoop.metrics2.annotation.Metric(always=false, sampleName=Ops, valueName=Time, about=, interval=10, type=DEFAULT, value=[Renewal failures since last successful login])
19:30:23.327 [main] DEBUG o.a.h.m.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics
19:30:23.348 [main] DEBUG o.a.hadoop.security.SecurityUtil - Setting hadoop.security.token.service.use_ip to true
19:30:23.373 [main] DEBUG org.apache.hadoop.security.Groups - Creating new Groups object
19:30:23.376 [main] DEBUG o.a.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library...
19:30:23.377 [main] DEBUG o.a.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
19:30:23.377 [main] DEBUG o.a.hadoop.util.NativeCodeLoader - java.library.path=/Users/rgs/Library/Java/Extensions:/Library/Java/Extensions:/Network/Library/Java/Extensions:/System/Library/Java/Extensions:/usr/lib/java:.
19:30:23.377 [main] WARN o.a.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
19:30:23.378 [main] DEBUG o.a.hadoop.util.PerformanceAdvisory - Falling back to shell based
19:30:23.380 [main] DEBUG o.a.h.s.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
19:30:23.486 [main] DEBUG org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
19:30:23.506 [main] DEBUG o.a.h.security.UserGroupInformation - hadoop login
19:30:23.507 [main] DEBUG o.a.h.security.UserGroupInformation - hadoop login commit
19:30:23.516 [main] DEBUG o.a.h.security.UserGroupInformation - using local user:UnixPrincipal: rgs
19:30:23.517 [main] DEBUG o.a.h.security.UserGroupInformation - Using user: "UnixPrincipal: rgs" with name rgs
19:30:23.517 [main] DEBUG o.a.h.security.UserGroupInformation - User entry: "rgs"
19:30:23.517 [main] DEBUG o.a.h.security.UserGroupInformation - UGI loginUser:rgs (auth:SIMPLE)
19:30:23.611 [main] DEBUG org.apache.hadoop.fs.FileSystem - Loading filesystems
19:30:23.626 [main] DEBUG org.apache.hadoop.fs.FileSystem - hdfs:// = class org.apache.hadoop.hdfs.DistributedFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-hdfs-client-3.2.0.jar
19:30:23.646 [main] DEBUG org.apache.hadoop.fs.FileSystem - webhdfs:// = class org.apache.hadoop.hdfs.web.WebHdfsFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-hdfs-client-3.2.0.jar
19:30:23.646 [main] DEBUG org.apache.hadoop.fs.FileSystem - swebhdfs:// = class org.apache.hadoop.hdfs.web.SWebHdfsFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-hdfs-client-3.2.0.jar
19:30:23.651 [main] DEBUG org.apache.hadoop.fs.FileSystem - nullscan:// = class org.apache.hadoop.hive.ql.io.NullScanFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hive-exec-2.3.7-core.jar
19:30:23.668 [main] DEBUG org.apache.hadoop.fs.FileSystem - file:// = class org.apache.hadoop.fs.LocalFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-common-3.2.0.jar
19:30:23.669 [main] DEBUG org.apache.hadoop.fs.FileSystem - file:// = class org.apache.hadoop.hive.ql.io.ProxyLocalFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hive-exec-2.3.7-core.jar
19:30:23.677 [main] DEBUG org.apache.hadoop.fs.FileSystem - viewfs:// = class org.apache.hadoop.fs.viewfs.ViewFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-common-3.2.0.jar
19:30:23.680 [main] DEBUG org.apache.hadoop.fs.FileSystem - har:// = class org.apache.hadoop.fs.HarFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-common-3.2.0.jar
19:30:23.683 [main] DEBUG org.apache.hadoop.fs.FileSystem - http:// = class org.apache.hadoop.fs.http.HttpFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-common-3.2.0.jar
19:30:23.684 [main] DEBUG org.apache.hadoop.fs.FileSystem - https:// = class org.apache.hadoop.fs.http.HttpsFileSystem from /Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/pyspark/jars/hadoop-common-3.2.0.jar
19:30:23.685 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking for FS supporting file
19:30:23.685 [main] DEBUG org.apache.hadoop.fs.FileSystem - looking for configuration option fs.file.impl
19:30:23.696 [main] DEBUG org.apache.hadoop.fs.FileSystem - Looking in service filesystems for implementation class
19:30:23.696 [main] DEBUG org.apache.hadoop.fs.FileSystem - FS for file is class org.apache.hadoop.hive.ql.io.ProxyLocalFileSystem
19:30:23.808 [main] INFO org.apache.spark.SecurityManager - Changing view acls to: rgs
19:30:23.808 [main] INFO org.apache.spark.SecurityManager - Changing modify acls to: rgs
19:30:23.809 [main] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
19:30:23.810 [main] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
19:30:23.811 [main] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rgs); groups with view permissions: Set(); users with modify permissions: Set(rgs); groups with modify permissions: Set()
19:30:23.949 [main] DEBUG o.a.s.api.python.PythonGatewayServer - Started PythonGatewayServer on port 59202
2021-09-30 19:30:24 WARN Hail:43 - This Hail JAR was compiled for Spark 3.1.1, running with Spark 3.1.2.
Compatibility is not guaranteed.
19:30:24.243 [Thread-4] INFO org.apache.spark.SparkContext - Running Spark version 3.1.2
19:30:24.300 [Thread-4] INFO o.a.spark.resource.ResourceUtils - ==============================================================
19:30:24.300 [Thread-4] INFO o.a.spark.resource.ResourceUtils - No custom resources configured for spark.driver.
19:30:24.301 [Thread-4] INFO o.a.spark.resource.ResourceUtils - ==============================================================
19:30:24.301 [Thread-4] INFO org.apache.spark.SparkContext - Submitted application: Hail
19:30:24.308 [Thread-4] INFO org.apache.spark.SparkContext - Spark configuration:
spark.app.name=Hail
spark.app.startTime=1633026624243
spark.driver.extraClassPath=/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/backend/hail-all-spark.jar
spark.driver.maxResultSize=0
spark.executor.extraClassPath=./hail-all-spark.jar
spark.hadoop.io.compression.codecs=org.apache.hadoop.io.compress.DefaultCodec,is.hail.io.compress.BGzipCodec,is.hail.io.compress.BGzipCodecTbi,org.apache.hadoop.io.compress.GzipCodec
spark.hadoop.mapreduce.input.fileinputformat.split.minsize=0
spark.jars=file:///Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/backend/hail-all-spark.jar
spark.kryo.registrator=is.hail.kryo.HailKryoRegistrator
spark.kryoserializer.buffer.max=1g
spark.logConf=true
spark.master=local[*]
spark.repl.local.jars=file:///Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/backend/hail-all-spark.jar
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.submit.deployMode=client
spark.submit.pyFiles=
spark.ui.showConsoleProgress=false
19:30:24.345 [Thread-4] INFO o.a.spark.resource.ResourceProfile - Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
19:30:24.363 [Thread-4] INFO o.a.spark.resource.ResourceProfile - Limiting resource is cpu
19:30:24.363 [Thread-4] INFO o.a.s.r.ResourceProfileManager - Added ResourceProfile id: 0
19:30:24.448 [Thread-4] INFO org.apache.spark.SecurityManager - Changing view acls to: rgs
19:30:24.449 [Thread-4] INFO org.apache.spark.SecurityManager - Changing modify acls to: rgs
19:30:24.449 [Thread-4] INFO org.apache.spark.SecurityManager - Changing view acls groups to:
19:30:24.449 [Thread-4] INFO org.apache.spark.SecurityManager - Changing modify acls groups to:
19:30:24.449 [Thread-4] INFO org.apache.spark.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(rgs); groups with view permissions: Set(); users with modify permissions: Set(rgs); groups with modify permissions: Set()
19:30:24.837 [Thread-4] DEBUG o.a.s.network.server.TransportServer - Shuffle server started on port: 59204
19:30:24.847 [Thread-4] INFO org.apache.spark.util.Utils - Successfully started service 'sparkDriver' on port 59204.
19:30:24.863 [Thread-4] DEBUG org.apache.spark.SparkEnv - Using serializer: class org.apache.spark.serializer.KryoSerializer
19:30:24.894 [Thread-4] INFO org.apache.spark.SparkEnv - Registering MapOutputTracker
19:30:24.894 [Thread-4] DEBUG o.a.s.MapOutputTrackerMasterEndpoint - init
19:30:24.969 [Thread-4] INFO org.apache.spark.SparkEnv - Registering BlockManagerMaster
19:30:25.017 [Thread-4] INFO o.a.s.s.BlockManagerMasterEndpoint - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
19:30:25.018 [Thread-4] INFO o.a.s.s.BlockManagerMasterEndpoint - BlockManagerMasterEndpoint up
19:30:25.023 [Thread-4] INFO org.apache.spark.SparkEnv - Registering BlockManagerMasterHeartbeat
19:30:25.060 [Thread-4] INFO o.a.spark.storage.DiskBlockManager - Created local directory at /private/var/folders/sm/795yv_kj01z2spgzwlq0syth0000gn/T/blockmgr-61101b16-4322-4f44-9e36-fc0d6f379d05
19:30:25.062 [Thread-4] DEBUG o.a.spark.storage.DiskBlockManager - Adding shutdown hook
19:30:25.104 [Thread-4] INFO o.a.spark.storage.memory.MemoryStore - MemoryStore started with capacity 366.3 MiB
19:30:25.137 [Thread-4] INFO org.apache.spark.SparkEnv - Registering OutputCommitCoordinator
19:30:25.138 [Thread-4] DEBUG o.a.s.s.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - init
19:30:25.168 [Thread-4] DEBUG org.apache.spark.SecurityManager - Created SSL options for ui: SSLOptions{enabled=false, port=None, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
Traceback (most recent call last):
File "<stdin>", line 4, in <module>
File "<decorator-gen-1741>", line 2, in balding_nichols_model
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 576, in wrapper
args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 543, in check_all
args_.append(arg_check(args[i], name, arg_name, checker))
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 584, in arg_check
return checker.check(arg, function_name, arg_name)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 82, in check
return tc.check(x, caller, param)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 328, in check
return f(tc.check(x, caller, param))
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/genetics/reference_genome.py", line 10, in <lambda>
reference_genome_type = oneof(transformed((str, lambda x: hl.get_reference(x))), rg_type)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/context.py", line 554, in get_reference
Env.hc()
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/utils/java.py", line 55, in hc
init()
File "<decorator-gen-1821>", line 2, in init
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/typecheck/check.py", line 577, in wrapper
return __original_func(*args_, **kwargs_)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/context.py", line 252, in init
skip_logging_configuration, optimizer_iterations)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/hail/backend/spark_backend.py", line 174, in __init__
jsc, app_name, master, local, True, min_block_size, tmpdir, local_tmpdir)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/py4j/java_gateway.py", line 1305, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/Users/rgs/opt/miniconda3/envs/conda-for-hail/lib/python3.7/site-packages/py4j/protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling z:is.hail.backend.spark.SparkBackend.apply.
: java.lang.NoSuchMethodError: org.slf4j.helpers.MessageFormatter.arrayFormat(Ljava/lang/String;[Ljava/lang/Object;)Lorg/slf4j/helpers/FormattingTuple;
at org.sparkproject.jetty.util.log.JettyAwareLogger.log(JettyAwareLogger.java:624)
at org.sparkproject.jetty.util.log.JettyAwareLogger.info(JettyAwareLogger.java:314)
at org.sparkproject.jetty.util.log.Slf4jLog.info(Slf4jLog.java:77)
at org.sparkproject.jetty.util.log.Log.initialized(Log.java:169)
at org.sparkproject.jetty.util.log.Log.getLogger(Log.java:276)
at org.sparkproject.jetty.util.log.Log.getLogger(Log.java:265)
at org.sparkproject.jetty.util.component.AbstractLifeCycle.<clinit>(AbstractLifeCycle.java:36)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:117)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:104)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:89)
at org.apache.spark.ui.WebUI.$anonfun$attachTab$1(WebUI.scala:70)
at org.apache.spark.ui.WebUI.$anonfun$attachTab$1$adapted(WebUI.scala:70)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:70)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:60)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:81)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:183)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:478)
at is.hail.backend.spark.SparkBackend$.configureAndCreateSparkContext(SparkBackend.scala:146)
at is.hail.backend.spark.SparkBackend$.apply(SparkBackend.scala:222)
at is.hail.backend.spark.SparkBackend.apply(SparkBackend.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
19:30:25.560 [main] DEBUG o.a.s.api.python.PythonGatewayServer - Exiting due to broken pipe from Python driver
(conda-for-hail) MacBook-Air-3:~ rgs$ 19:30:25.564 [shutdown-hook-0] INFO o.a.spark.storage.DiskBlockManager - Shutdown hook called
19:30:25.580 [shutdown-hook-0] INFO o.a.spark.util.ShutdownHookManager - Shutdown hook called
19:30:25.580 [shutdown-hook-0] INFO o.a.spark.util.ShutdownHookManager - Deleting directory /private/var/folders/sm/795yv_kj01z2spgzwlq0syth0000gn/T/spark-1882f0f2-aad1-40e4-a272-ba3e80661af9
19:30:25.585 [shutdown-hook-0] INFO o.a.spark.util.ShutdownHookManager - Deleting directory /private/var/folders/sm/795yv_kj01z2spgzwlq0syth0000gn/T/spark-53cac155-6e4c-4c78-93f7-b3bb65dc427b/userFiles-389708a4-5749-4a3e-bd3a-c0cc3cecccb7
19:30:25.590 [shutdown-hook-0] INFO o.a.spark.util.ShutdownHookManager - Deleting directory /private/var/folders/sm/795yv_kj01z2spgzwlq0syth0000gn/T/spark-53cac155-6e4c-4c78-93f7-b3bb65dc427b
19:30:25.597 [Thread-1] DEBUG o.a.hadoop.util.ShutdownHookManager - Completed shutdown in 0.033 seconds; Timeouts: 0
19:30:25.603 [Thread-1] DEBUG o.a.hadoop.util.ShutdownHookManager - ShutdownHookManger completed shutdown.