Trouble downloading public 1000 Genomes data

I’m trying to follow the tutorial from here: Hail | GWAS Tutorial, and I’m getting the following error message:

hl.utils.get_1kg('data/')

2021-08-27 17:20:24 Hail: INFO: downloading 1KG VCF ...
Source: https://storage.googleapis.com/hail-tutorial/1kg.vcf.bgz

FatalError Traceback (most recent call last)
<ipython-input-...> in <module>
----> 1 hl.utils.get_1kg('data/')

~\AppData\Roaming\Python\Python37\site-packages\hail\utils\tutorial.py in get_1kg(output_dir, overwrite)
78 f' Source: {source}')
79 sync_retry_transient_errors(urlretrieve, resources['1kg_matrix_table'], tmp_vcf)
---> 80 cluster_readable_vcf = _copy_to_tmp(fs, local_path_uri(tmp_vcf), extension='vcf.bgz')
81 info('importing VCF and writing to matrix table...')
82 hl.import_vcf(cluster_readable_vcf, min_partitions=16).write(matrix_table_path, overwrite=True)

~\AppData\Roaming\Python\Python37\site-packages\hail\utils\tutorial.py in _copy_to_tmp(fs, src, extension)
38 def _copy_to_tmp(fs, src, extension=None):
39 dst = new_temp_file(extension=extension)
---> 40 fs.copy(src, dst)
41 return dst
42

~\AppData\Roaming\Python\Python37\site-packages\hail\fs\hadoop_fs.py in copy(self, src, dest)
25
26 def copy(self, src: str, dest: str):
---> 27 self._jfs.copy(src, dest, False)
28
29 def exists(self, path: str) -> bool:

C:\ProgramData\Anaconda3\lib\site-packages\py4j\java_gateway.py in __call__(self, *args)
1255 answer = self.gateway_client.send_command(command)
1256 return_value = get_return_value(
-> 1257 answer, self.gateway_client, self.target_id, self.name)
1258
1259 for temp_arg in temp_args:

~\AppData\Roaming\Python\Python37\site-packages\hail\backend\spark_backend.py in deco(*args, **kwargs)
40 raise FatalError('%s\n\nJava stack trace:\n%s\n'
41 'Hail version: %s\n'
---> 42 'Error summary: %s' % (deepest, full, hail.version, deepest)) from None
43 except pyspark.sql.utils.CapturedException as e:
44 raise FatalError('%s\n\nJava stack trace:\n%s\n'

FatalError: IllegalArgumentException: Wrong FS: file://C:\Users\MAHANT~1\AppData\Local\Temp\tmpvl2wrxhw\1kg.vcf.bgz, expected: file:///

Java stack trace:
java.lang.IllegalArgumentException: Wrong FS: file://C:\Users\MAHA~1\AppData\Local\Temp\tmpvl2wrxhw\1kg.vcf.bgz, expected: file:///
at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:649)
at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:82)
at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:606)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)
at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)
at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:142)
at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:346)
at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:769)
at is.hail.io.fs.HadoopFS.openNoCompression(HadoopFS.scala:83)
at is.hail.io.fs.FS$class.copy(FS.scala:188)
at is.hail.io.fs.HadoopFS.copy(HadoopFS.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)

Hail version: 0.2.56-87d89a1d8159
Error summary: IllegalArgumentException: Wrong FS: file://C:\Users\MAHA~1\AppData\Local\Temp\tmpvl2wrxhw\1kg.vcf.bgz, expected: file:///

Can you please suggest what the mistake is? TIA

Hail isn’t currently known to work on Windows, though I believe people have gotten around this by using the Windows Subsystem for Linux (WSL). The error summary points at the cause: on Windows, the downloaded temp file gets turned into a URI like file://C:\..., and Hadoop’s local filesystem rejects any file URI whose authority part isn’t empty (hence “expected: file:///”).
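To see why Hadoop complains, here is a small illustration using Python’s standard library (the path is made up; only the two-slash vs. three-slash prefix matters):

from urllib.parse import urlsplit

# Well-formed local-file URI: empty authority, the drive letter stays in the path.
ok = urlsplit('file:///C:/Users/me/Temp/1kg.vcf.bgz')
print(ok.netloc, '|', ok.path)   # | /C:/Users/me/Temp/1kg.vcf.bgz

# The URI built on Windows has only two slashes, so the drive letter is
# parsed as the URI authority (i.e. a host name). Hadoop then treats it
# as a foreign filesystem and raises "Wrong FS: ... expected: file:///".
bad = urlsplit('file://C:/Users/me/Temp/1kg.vcf.bgz')
print(bad.netloc, '|', bad.path)  # C: | /Users/me/Temp/1kg.vcf.bgz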
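For reference, once you have a working Linux environment (for example Ubuntu on WSL), the tutorial’s first steps are just the following; this is only a sketch, with the data/ paths taken from the tutorial itself:

import hail as hl

hl.init()  # start the local Spark backend

# Download the tutorial's subsetted 1KG data into ./data/ and write the
# matrix table; this is the call that failed above on Windows.
hl.utils.get_1kg('data/')

# Read the matrix table back and inspect its schema.
mt = hl.read_matrix_table('data/1kg.mt')
mt.describe()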