Download large file from GCS with time-out error

I am trying to download a set of 132 data files from a Terra workspace GCS location. I have no issue downloading each one of them individually from the Google Cloud console's GCS listing page by point and click. But I have a hard time downloading them after combining all of them into a tar.gz file.
I even split those files into two tar.gz files, one 2.1 GB and the other 4.7 GB. If I download them from the browser, it always runs into a time-out error (estimated at 4-5 hours of download time). What other options do I have to download files of this size?

Hi @truckload ,

Can you tell us a bit more about how you’re trying to download the combined tar.gz file? Are you using gcloud commands or trying to use the GCP interface?

Thanks!
Ava

I am trying to use the GCP interface. It runs into the time-out problem. BTW, the GCP interface still suggests using gsutil …
I am not running gcloud commands from my laptop, because my PowerShell does not allow me to run gcloud authentication. I am behind the NIH network, which may be the main issue. Otherwise, I will have to ask Biowulf to see if they can allow me to run gcloud from the terminal …
I am also wondering if the data uploader tools on Terra can make the file transfer more streamlined?
Thank you.

In general, Google command line tools will be the most direct/efficient way to move from AnVIL → Institutional computing.
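If the blocker is interactive authentication on a restricted network (as described above for PowerShell), the browserless login flow may help. This is only a sketch; the bucket path is a placeholder, not the actual workspace bucket:

```shell
# Authenticate without launching a local browser: gcloud prints a URL
# you can open on any machine, then paste the code back into the terminal.
gcloud auth login --no-launch-browser

# Verify which account is now active.
gcloud auth list
```

Whether the NIH network permits the outbound connections gcloud needs is a separate question for IT, but this at least removes the local-browser requirement.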

It seems reasonable that NIH should support gcloud storage (or gsutil) for interfacing with AnVIL, so it might be worth emailing Josh Doss and/or the IT team.

In the meantime, if you already have the tar.gz file created, you could also try downloading from the AnVIL/Terra side (although you might have the same timeout issue). Example:
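As a rough sketch (the gs:// path below is a placeholder; substitute your workspace's actual bucket URL, shown in the workspace's cloud information panel):

```shell
# Copy the archive to the current directory. gcloud storage performs
# resumable, chunked downloads for large objects, so an interrupted
# transfer can be retried without starting over.
gcloud storage cp gs://YOUR_WORKSPACE_BUCKET/combined.tar.gz .

# Older gsutil equivalent, if that is what is installed:
gsutil cp gs://YOUR_WORKSPACE_BUCKET/combined.tar.gz .
```

Unlike a browser download, re-running the same command after a network drop resumes from where it left off, which matters for a multi-hour 4.7 GB transfer.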