Running hail locally - number of cores

Hail running on Apache Spark in local mode uses all available cores by default. You can verify this by opening the Spark Web UI (the URL is printed during initialization) and checking the Executors tab to confirm everything is being used. To request a specific number of local cores explicitly, pass a master string to `hl.init`, e.g. `hl.init(master='local[96]')` for 96 cores.
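For example, a minimal sketch (the core count here is just an illustration; size it to your machine):

```python
import hail as hl

# 'local[N]' runs Spark in local mode with N worker threads;
# 'local[*]' (the default) uses all available cores.
hl.init(master='local[96]')
```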

You should also make sure to set the memory available to Spark with `PYSPARK_SUBMIT_ARGS`:
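A minimal sketch: the environment variable must be set before Hail launches the JVM, and the `8g` value below is an assumed example to adjust for your machine. In local mode the executors run inside the driver process, so `--driver-memory` controls the total memory available.

```python
import os

# Must be set before hl.init() starts the JVM.
# 8g is an example value; size it to your available RAM.
os.environ['PYSPARK_SUBMIT_ARGS'] = '--driver-memory 8g pyspark-shell'

import hail as hl
hl.init(master='local[96]')
```

Note the trailing `pyspark-shell` token, which PySpark requires at the end of `PYSPARK_SUBMIT_ARGS`.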