If your log archive lives in a Google Cloud Storage bucket, here are a few ways you can query it:
- use BigQuery (if you have permissions)
- download log files locally and query them on your machine
- use Cloud Shell (this is what I recommend)
Using Cloud Shell is the best way to go imo because you are not downloading log files to your own machine (which can take a lot of time); instead, you run the same commands while logged in to a machine inside Google Cloud's infrastructure. This gives you the best download speed possible.
So this is my current flow:

- open Cloud Shell (the button for that is located in the top right corner of the Google Cloud Console)
- you should see a terminal-like widget at the bottom of your screen
- now you can run gsutil commands in that terminal and dump the results into temporary files; after that you can download those files to your local machine

Here is an example of how you can query several files and dump the results into a temp file:
gsutil -m cat gs://prod-apps_logs/backend/2020/06/29/*.json | fgrep 96b397ca5bf8 > 0629.log
After running this command you will get a temp file 0629.log in your home directory. You can download it to your local machine for further analysis.
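If you want to try the cat-grep-redirect pattern without touching a real bucket, here is a local sketch of the same pipeline. The file names and the request id are stand-ins, not real data:

```shell
# Write two sample JSON log lines to a file (stand-in for the bucket contents).
printf '%s\n' '{"req":"96b397ca5bf8","msg":"ok"}' '{"req":"deadbeef","msg":"skip"}' > /tmp/sample.json

# Filter by the request id and dump matches into a temp file,
# just like the gsutil cat | fgrep > file pipeline above.
fgrep 96b397ca5bf8 /tmp/sample.json > /tmp/0629.log

cat /tmp/0629.log
```

Once you are comfortable with the pattern, swap the local file for `gsutil -m cat gs://...` and the rest of the pipeline stays the same.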
Hint: if your older logs are gzipped, use zcat:
gsutil -m cat gs://data/prod/2019/06/01/log__2019_06_01.json.gz | zcat | fgrep 96b397ca5bf8 > 190601.log
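Same idea, runnable locally: gzip a sample log file and decompress it on the fly with zcat before grepping. Again, the file name and request id are made up for the sketch:

```shell
# Create a gzipped sample log (stand-in for the archived bucket object).
printf '%s\n' '{"req":"96b397ca5bf8"}' '{"req":"other"}' | gzip > /tmp/log__2019_06_01.json.gz

# Decompress on the fly, filter, and dump matches to a temp file.
zcat /tmp/log__2019_06_01.json.gz | fgrep 96b397ca5bf8 > /tmp/190601.log

cat /tmp/190601.log
```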
Cheers