Exception in thread "Spark Context Cleaner" java.lang.OutOfMemoryError: Java heap space

Actually, this suggestion from the docs solves it:

"How do I increase the memory or RAM available to the JVM when I start Hail through Python?

If running Hail locally, memory can be increased by setting the PYSPARK_SUBMIT_ARGS environment variable. For example:

PYSPARK_SUBMIT_ARGS="--driver-memory 4g pyspark-shell" python script.py"
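For anyone who can't easily set a shell variable (e.g. in a notebook), the same thing can be done from inside Python, as long as it happens before pyspark/Hail is first imported, since Spark reads the variable at JVM launch. A minimal sketch, assuming a local Hail install with the default Spark backend:

```python
import os

# Must run before the first `import pyspark` / `import hail`,
# because the JVM is launched with these arguments.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--driver-memory 4g pyspark-shell"

# Only now import and initialize Hail:
# import hail as hl
# hl.init()
```

If the JVM has already been started in the session, changing the variable has no effect — restart the kernel/process first.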