How to find the peak number of cores used by a process

Hello,

I am running `hl.experimental.run_combiner` on GCP, submitting it with `submit [python job]`. While the job is running, I want to check the number of cores in use, like the chart I could see when running in a notebook. Please let me know how to get this information.

I'm not aware of a particularly nice way to interrogate the Spark cluster history on Google Dataproc. If you're using autoscaling, the Google Cloud Console may have a similar chart. You might also do some searching around the wider internet, since this isn't Hail-specific, but a general usability question about running Spark on Google Dataproc.
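One option while the job is still running: Spark's monitoring REST API exposes per-executor core counts at `/api/v1/applications/<app-id>/executors`. A minimal sketch is below; the UI host/port and application id are assumptions you would fill in for your cluster (on Dataproc you typically reach the Spark UI through an SSH tunnel or the Component Gateway):

```python
import json
from urllib.request import urlopen

def total_cores(executors):
    """Sum totalCores across non-driver executors, given the JSON list
    returned by Spark's /api/v1/applications/<app-id>/executors endpoint."""
    return sum(e["totalCores"] for e in executors if e["id"] != "driver")

def cluster_cores(ui_url, app_id):
    # ui_url is the Spark UI base, e.g. "http://<driver-host>:4040"
    # (assumption: the default Spark UI port; adjust for your setup).
    with urlopen(f"{ui_url}/api/v1/applications/{app_id}/executors") as r:
        return total_cores(json.load(r))

# Abridged example of the ExecutorSummary payload shape:
sample = [
    {"id": "driver", "totalCores": 0},
    {"id": "1", "totalCores": 4},
    {"id": "2", "totalCores": 4},
]
print(total_cores(sample))  # → 8
```

This only reports the current allocation, not a historical peak; for history you would need the Spark history server or the Dataproc monitoring charts mentioned above.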

Hi Tim, thank you for letting me know. I thought it was related to Hail because the graph is output by a Hail function. Thanks again.