No space left on device - export_vcf

Hi there,

I’m trying to export a VDS to VCF, but I’m getting a "hail.java.FatalError: IOException: No space left on device" error.

I have plenty of space on the drive I’m writing to, so I’m not sure where the limitation may be.

vds.write("/Burden_data/WGS_CHD_Burden_samples.vds", overwrite=True)
vds.export_vcf("/Burden_data/WGS_CHD_Burden_samples.vcf.bgz")

The vds.write succeeds and I can see the VDS has been created (547 MB), but the export_vcf fails. Given the size of the VDS, the VCF should not be that large.

When starting my spark-submit job I set --driver-memory to 22G,
and I have pointed $SPARK_WORKER_DIR and $SPARK_LOCAL_DIRS at a filesystem with 400 GB of space.

I hope someone can help.
Thanks in advance for your assistance.
Eddie

What’s the Hail temp dir (passed to the HailContext)?

I have a vanilla HailContext():

hc = HailContext()

I see from the docs that it defaults to /tmp:

class hail.HailContext(sc=None, app_name='Hail', master=None, local='local[*]', log='hail.log', quiet=False, append=False, parquet_compression='snappy', min_block_size=1, branching_factor=50, tmp_dir='/tmp')

I will amend it to a specific location.
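Something like this is what I have in mind (just a sketch; the tmp_dir path below is a placeholder for a directory on the 400 GB filesystem, not a path from my actual setup):

from hail import HailContext

# Point Hail's scratch space at the large filesystem instead of the default /tmp.
# '/Burden_data/hail_tmp' is a hypothetical directory; substitute a real one.
hc = HailContext(tmp_dir='/Burden_data/hail_tmp')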

thanks.

Yep, that would do it. export_vcf writes file shards to the temp dir before serially concatenating them all together, and I think the shards were blowing out /tmp.
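As an aside, if your Hail version's export_vcf supports the parallel flag (an assumption on my part; check the docs for your build), you could also sidestep the concatenation step entirely by writing one shard per partition straight to the destination:

# Assumes this build's export_vcf accepts parallel=True; the output is then a
# directory of per-partition VCF shards rather than a single concatenated file.
vds.export_vcf("/Burden_data/WGS_CHD_Burden_samples.vcf.bgz", parallel=True)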