"No space left" error with Hail installed using conda

Hi,
I know this error has been asked and answered before, but the existing answers don't fix my problem.

I have an account on a computing cluster where each user's home folder is limited to 10 MB. I successfully installed Hail and downloaded the gnomAD data, but when I try to process it I get the error "hail.utils.java.FatalError: IOException: No space left on device". From searching the internet and this forum, it seems I need to point Spark's local folder to a directory with enough space. However, I installed Hail with conda inside a conda environment, and I can't figure out where to change the Spark local folder. Could you please help me? Many thanks!

Set tmp_dir in hl.init to a directory with plenty of space.
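For example, a minimal sketch (the scratch path is a placeholder; substitute a directory on your cluster that actually has free space):

import hail as hl

# /scratch/username/hail_tmp is hypothetical; point tmp_dir at any
# location with enough room for Hail's temporary files.
hl.init(tmp_dir='/scratch/username/hail_tmp')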


Thanks, that fixed my error. But now I get another error: "java.lang.OutOfMemoryError: Java heap space". Is there also a parameter we can set in the init function to fix this?

Thanks danking. The solution from olavur below works for me.

import hail as hl

# Increase the Spark driver's heap; adjust 100g to whatever your cluster allows.
hl.init(spark_conf={'spark.driver.memory': '100g'})
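For anyone hitting both errors, the two settings can be combined in a single hl.init call; this is just a sketch, and the path and memory size are examples to adapt to your cluster's limits:

import hail as hl

# Roomy temp directory for spill files plus a larger driver heap.
hl.init(
    tmp_dir='/scratch/username/hail_tmp',
    spark_conf={'spark.driver.memory': '100g'},
)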