You can do that locally or in Cloud Shell. I would recommend using a VM on Google or Amazon instead, because Cloud Shell terminates your session after a few minutes of inactivity, which is annoying for bigger datasets.
Before you begin, connect to your virtual machine via SSH and install the Google Cloud utilities (you can find a tutorial here).
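If gsutil is not already on the machine, a minimal sketch of getting it (this assumes a Debian/Ubuntu VM; on Google Compute Engine images the Cloud SDK is usually preinstalled, and on other distros the package setup differs):

```shell
# Install the Google Cloud SDK, which includes gsutil
# (may require adding Google's apt repository first on a plain Ubuntu VM)
sudo apt update
sudo apt install -y google-cloud-sdk

# Authenticate and pick a default project
gcloud init
```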
You will also have to install and configure Amazon's awscli tool. This can be done like this:

sudo apt install awscli
aws configure
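Before kicking off a large transfer, it can be worth checking that the AWS credentials actually work by listing the source bucket (the bucket name and path here are placeholders):

```shell
# Quick sanity check: list the S3 source path.
# Fails fast with an auth error if the credentials are wrong.
aws s3 ls s3://bucket/path/
```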
Now you can copy files from s3 to google like this:
gsutil -m rsync -r s3://bucket/path/ gs://anotherbucket
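For bigger datasets you can preview what would be transferred first; gsutil rsync supports a dry-run flag:

```shell
# -n performs a dry run: reports what would be copied without transferring anything
gsutil -m rsync -r -n s3://bucket/path/ gs://anotherbucket
```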
On Google Cloud Shell I still got the following error:
ERROR 0127 10:05:02.767142 utils.py] Unable to read instance data, giving up
Failure: No handler was ready to authenticate. 4 handlers were checked. ['HmacAuthV1Handler', 'DevshellAuth', 'OAuth2Auth', 'OAuth2ServiceAccountAuth'] Check your credentials.
But setting the key and secret manually solved the problem:
export AWS_ACCESS_KEY_ID="aws_access_key_id"
export AWS_SECRET_ACCESS_KEY="aws_secret_access_key"