- You mis-typed the code snippet; it is:

  ```python
  sc = pyspark.SparkContext()
  ```

- The original error, as noted here, also affects some computers connected to WiFi.
- The original error suggests setting `spark.driver.bindAddress`. You should try setting this: it is a Spark property, so you can set it using `hl.init`'s `spark_conf` keyword argument. Try setting it to an IP address that Spark can actually bind to, for example `0.0.0.0` or `127.0.0.1` (see the sketch after this list).
- I assume, because you cannot edit `/etc/hosts`, that you do not have `root` access to your machine. I suggest you ask whoever maintains that machine to help you get `pyspark` working. PySpark is a dependency of Hail. Until the two `pyspark` lines that John posted work, you will be unable to use Hail.
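
For concreteness, here is a minimal sketch combining the suggestions above: first the corrected `pyspark` snippet (with the `import pyspark` it presumably requires), then `hl.init` with `spark.driver.bindAddress` passed via `spark_conf`. The address `127.0.0.1` is just one of the two values suggested; use whichever your machine can actually bind to.

```python
import pyspark

# Sanity check: Hail cannot work until a bare SparkContext starts cleanly.
sc = pyspark.SparkContext()
sc.stop()

import hail as hl

# Pass the Spark property suggested by the error message through hl.init.
# 127.0.0.1 is one of the two addresses suggested above; adjust as needed.
hl.init(spark_conf={"spark.driver.bindAddress": "127.0.0.1"})
```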