One thing I don’t understand: why does Hail (0.2.83) only allow Scala 2.12.x when the Spark version is 3.x, but force Scala 2.11.x when the Spark version is 2.4.x? Doesn’t Spark 2.4.x also support Scala 2.12.x? What’s the best option if we can’t access Spark 3.x?
Hey Simon,
If I remember correctly, early versions of Spark 2.4.x required Scala 2.11, and a later patch release made it possible to use Scala 2.12. By Spark 2.4.8, you should be able to use Scala 2.12. In fact, on the latest release of Hail, I believe we now only support Scala 2.12 to keep things simpler, since that works with Spark 2.4.8 and all the different Spark 3 releases. Is your goal to use Spark 2.4.8 with Scala 2.12? Because by updating Hail, I think you can do that.
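If you’re not sure which Scala build of Spark 2.4.x your cluster is running, one quick sanity check is to ask the JVM which Scala version Spark was compiled against. This is just a sketch via PySpark’s py4j gateway; note that the underscore-prefixed `_jvm` attribute is internal rather than public API, so treat it as a debugging trick, not something to rely on in production code.

```python
from pyspark.sql import SparkSession

# Spin up (or attach to) a local Spark session just for the check.
spark = SparkSession.builder.master("local[1]").appName("scala-version-check").getOrCreate()

# Ask the JVM which Scala version this Spark build was compiled against.
# Prints something like "version 2.11.12" or "version 2.12.10".
print(spark.sparkContext._jvm.scala.util.Properties.versionString())

spark.stop()
```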
Hi Johnc1231,
Thanks for your reply. I understand you only want to support Scala 2.12; however, if we use Spark 2.4.8 and Scala 2.12, it doesn’t pass the Hail build check.
That used to be true, but you’re on Hail 0.2.83, which isn’t the latest version. Newer versions can be built with Spark 2.4.8 and Scala 2.12.
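Once you’ve built a newer Hail from source against Spark 2.4.8, a simple way to confirm which versions actually got picked up at runtime is something like the sketch below, using Hail’s standard `hl.init`, `hl.version`, and `hl.spark_context` calls:

```python
import hail as hl

# Start Hail; this also starts (or attaches to) the underlying Spark backend.
hl.init()

# Print the Hail version and the Spark version it is actually running on,
# to confirm the build really is using Spark 2.4.8.
print("Hail:", hl.version())
print("Spark:", hl.spark_context().version)
```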
I see. Thanks for the clarification!
I am curious why you have to use Spark 2, though. It’s EOL now, and we likely won’t actively support it for much longer. Is this an on-prem cluster or something?
The latest Hail requires Elasticsearch 7 or 8, and we’re still on 6. It’s hard to upgrade everything all at once.