Connect gcloud machine to bucket

Hi, I am a little confused about how to connect a gcloud machine to my UKB bucket. I am trying to run a megamem machine but am currently testing the setup process using an n1-standard-8 machine.
This is the process I am using:

gcloud compute instances create lkptest --machine-type n1-standard-8
gcloud compute ssh lkptest

Then I run ls but nothing is there.
How do I access the bucket where my code and data are stored?

My guess is that Hail's dataproc tooling connects automatically somehow, because when I use hailctl dataproc start and hailctl dataproc connect ... notebook, the bucket appears in the notebook.

Once I get this working, I will probably also need to pip install several packages, such as hail, numpy, scipy, and matplotlib. Could you also offer direction on that? I tried the command found at the following link, but I got several errors.

The Jupyter provided by hailctl dataproc uses special software to read and edit Jupyter notebooks stored in buckets. You could set this up on your instance if you wanted to, but I might recommend instead keeping the files on your laptop and then running them like this:

gcloud compute scp --recurse my-source-code-dir/ lkptest:.
gcloud compute ssh lkptest -- python3 my-source-code-dir/main.py

That copies your code to the machine and then runs Python remotely on the machine, piping the output back to your laptop (main.py here stands in for whatever your entry-point script is called). You can edit the code on your laptop, commit it to GitHub, etc., while still running it remotely.
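As for the data: the bucket is object storage, not a filesystem mounted on the instance, which is why ls shows nothing after you SSH in. A minimal sketch of one option is to copy what you need onto the instance with gsutil from inside the SSH session (gs://my-ukb-bucket here is a placeholder, substitute your actual bucket name):

```shell
# From inside the instance (after gcloud compute ssh lkptest):
# copy input data down from the bucket onto the instance's disk
gsutil cp -r gs://my-ukb-bucket/data ./data

# ...run your analysis...

# copy results back up to the bucket when you are done
gsutil cp -r ./results gs://my-ukb-bucket/results
```

Note that the instance's service account needs read (and, for the upload, write) access to the bucket for this to work.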

What errors did you get?
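On the pip question, here is a sketch of what typically works on a fresh Debian-based instance; the exact steps depend on which image you chose, and the Java version Hail needs depends on the Hail version (it requires a JVM because it runs on Spark):

```shell
# Install pip and a JVM (assuming a Debian/Ubuntu image)
sudo apt-get update
sudo apt-get install -y python3-pip openjdk-11-jre-headless

# Install the packages you listed into the user site-packages
python3 -m pip install --user hail numpy scipy matplotlib
```

If that still fails, paste the errors here and we can go from there.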