GCP Jupyter error

Hello

I’ve updated to the latest version of Hail by running pip uninstall hail followed by pip install hail.

hail version: 0.2.108-fc03e9d5dc08

I create a cluster using the following command:

hailctl dataproc start my-hail-cluster --optional-components=JUPYTER --enable-component-gateway --master-machine-type n1-standard-8 --worker-machine-type n1-standard-4 --bucket my_bucket --project myproject --region us-central1 --requester-pays-allow-all --properties spark:spark.driver.maxResultSize=8g,spark:spark.executor.memory=4g

When I open the Jupyter link, I get the following error:
Server error: Internal server error: module 'google.auth.credentials' has no attribute 'CredentialsWithTokenUri'

I have used Hail like this multiple times, so I'm not sure why this has come up. The bucket and cluster are in the same region. If I start a non-Hail cluster, this issue does not occur.

Any help on this would be highly appreciated.

Hello again
Any help with this would be appreciated.

Thank you

Don’t use --optional-components=JUPYTER. You probably also do not need --enable-component-gateway.

hailctl automatically configures a version of Jupyter which works properly with Hail. You’re adding a different version of Jupyter which is, apparently, incompatible.
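For example, a start command along the lines of your original one, just without those two flags, should be enough (this is a sketch reusing the options from your command above; adjust names and values as needed):

hailctl dataproc start my-hail-cluster --master-machine-type n1-standard-8 --worker-machine-type n1-standard-4 --bucket my_bucket --project myproject --region us-central1 --requester-pays-allow-all --properties spark:spark.driver.maxResultSize=8g,spark:spark.executor.memory=4g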

Thanks for the clarification.

How can I then connect to Jupyter? I’ve tried external_ip:8132 and internal_ip:8132, but neither seems to connect.

Take a look at these docs.
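In short, from your local machine (not the master node), something like the following should open the notebook in your browser (assuming your cluster is named my-hail-cluster and is in the zone you started it in):

hailctl dataproc connect my-hail-cluster notebook --zone us-central1-c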

Thank you

I ran the following command on the master node of my cluster:
hailctl dataproc connect my-cluster notebook --zone us-central1-c

And I get the following error:

ERROR: (gcloud.compute.ssh) Could not fetch resource:

  • Request had insufficient authentication scopes.

I tried authenticating with gcloud auth login; however, the authentication doesn’t go through.

Apologies for the continued headache; your support is greatly appreciated!

Your user account lacks sufficient privilege to SSH into VMs, which is how hailctl connects you to your notebook. Take a look at the suggestion in this Stack Overflow answer.
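For reference, one common way to address this (an illustrative sketch; the email is a placeholder and the exact role your project admin grants may differ) is to have a project owner give your account a Compute Engine role that permits SSH, e.g.:

gcloud projects add-iam-policy-binding myproject --member="user:you@example.com" --role="roles/compute.instanceAdmin.v1"

Then run hailctl dataproc connect from your local machine, where you are authenticated as yourself via gcloud auth login, rather than from the master node.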