Comment by TeMPOraL 14 hours ago

> In this case I think you would have to disallow any internet communication besides the LLM provider itself, including the underlying host of the sandbox.

And the user too, because a human can also be prompt-injected! Prompt injection is fundamentally just the LLM flavor of social engineering.