I am trying to use Hail on Databricks and I get the following error:
In Databricks, developers should utilize the shared SparkContext instead of creating one using the constructor. In Scala and Python notebooks, the shared context can be accessed as sc. When running a job, you can access the shared context by calling SparkContext.getOrCreate().
I tried calling SparkContext.getOrCreate() and passing the resulting sc to hail.init(sc=sc), but I still get the above error. Any pointers on how to resolve this?
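For reference, this is roughly what I'm running in the notebook (a minimal sketch of my attempt; cluster configuration and Hail installation are assumed to be in place):

```python
from pyspark import SparkContext
import hail as hl

# Reuse Databricks' shared SparkContext instead of constructing a new one
sc = SparkContext.getOrCreate()

# Pass the existing context to Hail so it does not try to create its own
hl.init(sc=sc)
```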
Thanks for your help.