- You mis-typed the code snippet; it is:

  ```python
  import pyspark
  sc = pyspark.SparkContext()
  ```
- The original error, as noted here, also affects some computers connected to WiFi.
- The original error suggests setting `spark.driver.bindAddress`. You should try setting this. It is a Spark property, so you can set it using `hl.init`'s `spark_conf` keyword argument. Try setting it to an IP address that Spark can actually bind to: for example, `0.0.0.0` or `127.0.0.1`. There is a sketch of this after the list.
- I assume, because you cannot edit `/etc/hosts`, that you do not have `root` access to your machine. I suggest you ask whoever maintains that machine to help you get `pyspark` working. PySpark is a dependency of Hail; until the two `pyspark` lines that John posted work, you will be unable to use Hail.
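Here is a minimal sketch of passing `spark.driver.bindAddress` through `hl.init`'s `spark_conf` keyword argument; the choice of `127.0.0.1` is just an example and may need to be adjusted for your machine:

```python
import hail as hl

# spark_conf forwards Spark properties to the SparkContext that Hail
# creates. Binding the driver to 127.0.0.1 restricts it to the loopback
# interface; 0.0.0.0 instead binds to all available interfaces.
hl.init(spark_conf={'spark.driver.bindAddress': '127.0.0.1'})
```

If `hl.init` then starts without the bind-address error, this property was the problem; otherwise the underlying `pyspark` setup still needs attention.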