SparkContext error on Databricks

I am trying to use Hail on Databricks and I get the following error:

In Databricks, developers should utilize the shared SparkContext instead of creating one using the constructor. In Scala and Python notebooks, the shared context can be accessed as sc. When running a job, you can access the shared context by calling SparkContext.getOrCreate().

I tried using SparkContext.getOrCreate() and passing sc to hail.init(sc=sc), but I continue to get the above error. Any pointers on how this could be resolved?
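
For reference, here is roughly what I'm running in the notebook (a minimal sketch; it assumes Hail is already installed on the cluster):

from pyspark import SparkContext
import hail as hl

# Databricks notebooks expose a shared SparkContext; getOrCreate() returns it
sc = SparkContext.getOrCreate()

# Passing the shared context to Hail still raises the error quoted above
hl.init(sc=sc)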

Thanks for your help.

Hi @hail_q!

I think you forgot to include the error. Could you include that?

Disregard that: there's a bug in the latest released version of Hail. I'll make a fix and release a new version. Sorry about that.

In the meantime, you can use:

from hail.context import init_spark
init_spark(sc=sc)
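
For example, a Databricks notebook cell would look roughly like this (a sketch; it assumes the shared sc that Databricks defines in notebooks and that Hail is installed on the cluster):

import hail as hl
from hail.context import init_spark

# Initialize Hail's Spark backend directly against the shared Databricks context
init_spark(sc=sc)

# Quick smoke test: simulate a tiny dataset and count its rows and columns
mt = hl.balding_nichols_model(n_populations=3, n_samples=50, n_variants=100)
print(mt.count())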

Here’s the PR for the fix: [query] fix init(sc=sc): pass the spark context to init_spark by danking · Pull Request #11828 · hail-is/hail · GitHub

Thank you for the quick response 🙂