Comment by mrbluecoat 4 days ago
So is it just a wrapper around MitM Proxy?
Curious to see how you can get Gemini fully intercepted.
I've been intercepting its HTTP requests by running it inside a docker container with:
-e HTTP_PROXY=http://127.0.0.1:8080 -e HTTPS_PROXY=http://host.docker.internal:8080 -e NO_PROXY=localhost,127.0.0.1
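For context, the full invocation looks roughly like this (the image name is a placeholder, mitmproxy is assumed to be listening on the host at :8080, and the mitmproxy CA cert has to be made visible to the CLI inside the container):

  # Sketch only: placeholder image name; mitmproxy assumed to listen on the host at :8080.
  # --add-host is needed on Linux so host.docker.internal resolves (Docker Desktop provides it).
  # NODE_EXTRA_CA_CERTS makes a Node-based CLI trust the mitmproxy CA.
  docker run -it \
    --add-host=host.docker.internal:host-gateway \
    -e HTTP_PROXY=http://host.docker.internal:8080 \
    -e HTTPS_PROXY=http://host.docker.internal:8080 \
    -e NO_PROXY=localhost,127.0.0.1 \
    -e NODE_EXTRA_CA_CERTS=/certs/mitmproxy-ca-cert.pem \
    -v ~/.mitmproxy/mitmproxy-ca-cert.pem:/certs/mitmproxy-ca-cert.pem:ro \
    gemini-cli-image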
It worked with mitmproxy for a very brief period, but then the TLS handshake started failing and it kept prompting for re-authentication when proxied.
You can capture the whole auth flow and the initial conversation starters using Burp Suite and its certificate, but the Gemini chat responses fail in the CLI, which I understand is due to how Burp handles HTTP/2 (you can see the valid responses inside Burp Suite).
Kind of, yes... but with a nice CLI so that you don't have to set anything up: just run "sherlock claude" and "sherlock start" in two terminals, and everything Claude sends in that session gets stored. So no proxy setup or anything, just simple terminal commands. :)
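To be concrete, the whole workflow is just two commands, one per terminal:

  # in one terminal
  sherlock claude

  # in another terminal
  sherlock start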
> So is it just a wrapper around MitM Proxy?
Yes.
I created something similar months ago [*], but using Envoy Proxy [1], mkcert [2], my own Go server, and Little Snitch [3]. It works quite well. I was the first person to notice that Codex CLI now sends telemetry to ab.chatgpt.com, along with other curiosities like that, but I never bothered to open-source my implementation because anyone genuinely interested could easily replicate it in an afternoon with their favourite agent CLI (a rough sketch of the idea follows the footnotes).
[1] https://www.envoyproxy.io/
[2] https://github.com/FiloSottile/mkcert
[3] https://www.obdev.at/products/littlesnitch/
[*] In reality, I created this something like 6 years ago, before LLMs were popular, originally as a way to inspect all outgoing HTTP(S) traffic from the apps installed on my macOS system. Then, a few months ago, when I started using Codex CLI, I made some modifications to inspect agent CLI calls too.
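The Go side is essentially a logging reverse proxy that terminates TLS with a locally trusted mkcert certificate. A minimal sketch of that idea (not my actual implementation; the Envoy/Little Snitch traffic steering and the DNS loop-avoidance are left out) looks something like this:

  // Illustration only: terminate TLS with a mkcert-generated certificate,
  // log each request, then forward it to the real upstream. Something else
  // (Envoy routes, /etc/hosts entries, Little Snitch rules) has to steer the
  // CLI's traffic here, and the proxy must still reach the genuine upstream IP.
  package main

  import (
      "log"
      "net/http"
      "net/http/httputil"
      "net/url"
  )

  func main() {
      // Example upstream; the real target depends on which agent CLI you inspect.
      upstream, err := url.Parse("https://chatgpt.com")
      if err != nil {
          log.Fatal(err)
      }
      proxy := httputil.NewSingleHostReverseProxy(upstream)

      handler := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
          // Record every outgoing call before forwarding it.
          log.Printf("%s %s%s UA=%q", r.Method, r.Host, r.URL.Path, r.UserAgent())
          proxy.ServeHTTP(w, r)
      })

      // cert.pem / key.pem generated with e.g.: mkcert chatgpt.com ab.chatgpt.com
      log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", handler))
  }

With mkcert's root CA installed (mkcert -install), the CLI accepts that certificate, and the log gives roughly the same visibility as the mitmproxy setups mentioned above.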