Comment by kennywinker a day ago
Except it’s not, because people are deploying LLMs for real tasks, assuming the guardrails they put on them will hold.
As an example, I’m thinking of the car dealership chatbot that gave away $1 cars: https://futurism.com/the-byte/car-dealership-ai
If these things are being sold as things that can be locked down, it’s fair game to find holes in those lockdowns.
…and? people do stupid things and face consequences? so what?
I’d also advocate not exposing your unsecured database to the public internet.