Issue running Spark locally: java.net.BindException

  1. You mistyped the code snippet; it should be: sc = pyspark.SparkContext()
  2. As noted here, the original error also affects some computers connected via WiFi.
  3. The original error message suggests setting spark.driver.bindAddress, and you should try that. It is a Spark property, so you can set it with hl.init's spark_conf keyword argument. Set it to an address that Spark can actually bind to, for example 0.0.0.0 or 127.0.0.1; see the sketch after this list.
  4. Since you cannot edit /etc/hosts, I assume you do not have root access to your machine. I suggest you ask whoever maintains that machine to help you get pyspark working. PySpark is a dependency of Hail, so until the two pyspark lines that John posted work, you will be unable to use Hail.
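
Here is a minimal sketch of both suggestions. The spark.driver.bindAddress property and hl.init's spark_conf argument are as described above; the addresses are just the two examples mentioned (adjust them for your machine), and I am assuming the two lines John posted were the usual import pyspark / sc = pyspark.SparkContext() pair. Run one option per Python session, since only one SparkContext can be active at a time.

    # Option A: start Hail and pass the bind address through to the Spark driver.
    import hail as hl
    hl.init(spark_conf={'spark.driver.bindAddress': '127.0.0.1'})

    # Option B: test plain pyspark on its own, since Hail cannot start until this works.
    # The SparkConf line applies the same bindAddress workaround.
    import pyspark
    conf = pyspark.SparkConf().set('spark.driver.bindAddress', '127.0.0.1')
    sc = pyspark.SparkContext(conf=conf)

If option B still raises java.net.BindException with the bind address set, the problem is in the machine's network or hostname configuration rather than in Hail itself.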